Oct 01 13:00:41 localhost kernel: Linux version 5.14.0-617.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Mon Sep 15 21:46:13 UTC 2025
Oct 01 13:00:41 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct 01 13:00:41 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 01 13:00:41 localhost kernel: BIOS-provided physical RAM map:
Oct 01 13:00:41 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct 01 13:00:41 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct 01 13:00:41 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct 01 13:00:41 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct 01 13:00:41 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct 01 13:00:41 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct 01 13:00:41 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct 01 13:00:41 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct 01 13:00:41 localhost kernel: NX (Execute Disable) protection: active
Oct 01 13:00:41 localhost kernel: APIC: Static calls initialized
Oct 01 13:00:41 localhost kernel: SMBIOS 2.8 present.
Oct 01 13:00:41 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct 01 13:00:41 localhost kernel: Hypervisor detected: KVM
Oct 01 13:00:41 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct 01 13:00:41 localhost kernel: kvm-clock: using sched offset of 3964128861 cycles
Oct 01 13:00:41 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct 01 13:00:41 localhost kernel: tsc: Detected 2800.000 MHz processor
Oct 01 13:00:41 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Oct 01 13:00:41 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Oct 01 13:00:41 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct 01 13:00:41 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct 01 13:00:41 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct 01 13:00:41 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct 01 13:00:41 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct 01 13:00:41 localhost kernel: Using GB pages for direct mapping
Oct 01 13:00:41 localhost kernel: RAMDISK: [mem 0x2d7d0000-0x32bdffff]
Oct 01 13:00:41 localhost kernel: ACPI: Early table checksum verification disabled
Oct 01 13:00:41 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct 01 13:00:41 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 01 13:00:41 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 01 13:00:41 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 01 13:00:41 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct 01 13:00:41 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 01 13:00:41 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct 01 13:00:41 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct 01 13:00:41 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct 01 13:00:41 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct 01 13:00:41 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct 01 13:00:41 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct 01 13:00:41 localhost kernel: No NUMA configuration found
Oct 01 13:00:41 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct 01 13:00:41 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Oct 01 13:00:41 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct 01 13:00:41 localhost kernel: Zone ranges:
Oct 01 13:00:41 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct 01 13:00:41 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct 01 13:00:41 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct 01 13:00:41 localhost kernel:   Device   empty
Oct 01 13:00:41 localhost kernel: Movable zone start for each node
Oct 01 13:00:41 localhost kernel: Early memory node ranges
Oct 01 13:00:41 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct 01 13:00:41 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct 01 13:00:41 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct 01 13:00:41 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct 01 13:00:41 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct 01 13:00:41 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct 01 13:00:41 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct 01 13:00:41 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Oct 01 13:00:41 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct 01 13:00:41 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct 01 13:00:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct 01 13:00:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct 01 13:00:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct 01 13:00:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct 01 13:00:41 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct 01 13:00:41 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct 01 13:00:41 localhost kernel: TSC deadline timer available
Oct 01 13:00:41 localhost kernel: CPU topo: Max. logical packages:   8
Oct 01 13:00:41 localhost kernel: CPU topo: Max. logical dies:       8
Oct 01 13:00:41 localhost kernel: CPU topo: Max. dies per package:   1
Oct 01 13:00:41 localhost kernel: CPU topo: Max. threads per core:   1
Oct 01 13:00:41 localhost kernel: CPU topo: Num. cores per package:     1
Oct 01 13:00:41 localhost kernel: CPU topo: Num. threads per package:   1
Oct 01 13:00:41 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct 01 13:00:41 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct 01 13:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct 01 13:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct 01 13:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct 01 13:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct 01 13:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct 01 13:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct 01 13:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct 01 13:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct 01 13:00:41 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct 01 13:00:41 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct 01 13:00:41 localhost kernel: Booting paravirtualized kernel on KVM
Oct 01 13:00:41 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct 01 13:00:41 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct 01 13:00:41 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct 01 13:00:41 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Oct 01 13:00:41 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Oct 01 13:00:41 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Oct 01 13:00:41 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 01 13:00:41 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64", will be passed to user space.
Oct 01 13:00:41 localhost kernel: random: crng init done
Oct 01 13:00:41 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct 01 13:00:41 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct 01 13:00:41 localhost kernel: Fallback order for Node 0: 0 
Oct 01 13:00:41 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct 01 13:00:41 localhost kernel: Policy zone: Normal
Oct 01 13:00:41 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct 01 13:00:41 localhost kernel: software IO TLB: area num 8.
Oct 01 13:00:41 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct 01 13:00:41 localhost kernel: ftrace: allocating 49329 entries in 193 pages
Oct 01 13:00:41 localhost kernel: ftrace: allocated 193 pages with 3 groups
Oct 01 13:00:41 localhost kernel: Dynamic Preempt: voluntary
Oct 01 13:00:41 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Oct 01 13:00:41 localhost kernel: rcu:         RCU event tracing is enabled.
Oct 01 13:00:41 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct 01 13:00:41 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Oct 01 13:00:41 localhost kernel:         Rude variant of Tasks RCU enabled.
Oct 01 13:00:41 localhost kernel:         Tracing variant of Tasks RCU enabled.
Oct 01 13:00:41 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct 01 13:00:41 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct 01 13:00:41 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 01 13:00:41 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 01 13:00:41 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct 01 13:00:41 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct 01 13:00:41 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct 01 13:00:41 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct 01 13:00:41 localhost kernel: Console: colour VGA+ 80x25
Oct 01 13:00:41 localhost kernel: printk: console [ttyS0] enabled
Oct 01 13:00:41 localhost kernel: ACPI: Core revision 20230331
Oct 01 13:00:41 localhost kernel: APIC: Switch to symmetric I/O mode setup
Oct 01 13:00:41 localhost kernel: x2apic enabled
Oct 01 13:00:41 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Oct 01 13:00:41 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct 01 13:00:41 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Oct 01 13:00:41 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct 01 13:00:41 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct 01 13:00:41 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct 01 13:00:41 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct 01 13:00:41 localhost kernel: Spectre V2 : Mitigation: Retpolines
Oct 01 13:00:41 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct 01 13:00:41 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct 01 13:00:41 localhost kernel: RETBleed: Mitigation: untrained return thunk
Oct 01 13:00:41 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct 01 13:00:41 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct 01 13:00:41 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct 01 13:00:41 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct 01 13:00:41 localhost kernel: x86/bugs: return thunk changed
Oct 01 13:00:41 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct 01 13:00:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct 01 13:00:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct 01 13:00:41 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct 01 13:00:41 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct 01 13:00:41 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct 01 13:00:41 localhost kernel: Freeing SMP alternatives memory: 40K
Oct 01 13:00:41 localhost kernel: pid_max: default: 32768 minimum: 301
Oct 01 13:00:41 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct 01 13:00:41 localhost kernel: landlock: Up and running.
Oct 01 13:00:41 localhost kernel: Yama: becoming mindful.
Oct 01 13:00:41 localhost kernel: SELinux:  Initializing.
Oct 01 13:00:41 localhost kernel: LSM support for eBPF active
Oct 01 13:00:41 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 01 13:00:41 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct 01 13:00:41 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct 01 13:00:41 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct 01 13:00:41 localhost kernel: ... version:                0
Oct 01 13:00:41 localhost kernel: ... bit width:              48
Oct 01 13:00:41 localhost kernel: ... generic registers:      6
Oct 01 13:00:41 localhost kernel: ... value mask:             0000ffffffffffff
Oct 01 13:00:41 localhost kernel: ... max period:             00007fffffffffff
Oct 01 13:00:41 localhost kernel: ... fixed-purpose events:   0
Oct 01 13:00:41 localhost kernel: ... event mask:             000000000000003f
Oct 01 13:00:41 localhost kernel: signal: max sigframe size: 1776
Oct 01 13:00:41 localhost kernel: rcu: Hierarchical SRCU implementation.
Oct 01 13:00:41 localhost kernel: rcu:         Max phase no-delay instances is 400.
Oct 01 13:00:41 localhost kernel: smp: Bringing up secondary CPUs ...
Oct 01 13:00:41 localhost kernel: smpboot: x86: Booting SMP configuration:
Oct 01 13:00:41 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct 01 13:00:41 localhost kernel: smp: Brought up 1 node, 8 CPUs
Oct 01 13:00:41 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Oct 01 13:00:41 localhost kernel: node 0 deferred pages initialised in 28ms
Oct 01 13:00:41 localhost kernel: Memory: 7765468K/8388068K available (16384K kernel code, 5784K rwdata, 13988K rodata, 4072K init, 7304K bss, 616476K reserved, 0K cma-reserved)
Oct 01 13:00:41 localhost kernel: devtmpfs: initialized
Oct 01 13:00:41 localhost kernel: x86/mm: Memory block size: 128MB
Oct 01 13:00:41 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct 01 13:00:41 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct 01 13:00:41 localhost kernel: pinctrl core: initialized pinctrl subsystem
Oct 01 13:00:41 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct 01 13:00:41 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct 01 13:00:41 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct 01 13:00:41 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct 01 13:00:41 localhost kernel: audit: initializing netlink subsys (disabled)
Oct 01 13:00:41 localhost kernel: audit: type=2000 audit(1759323639.637:1): state=initialized audit_enabled=0 res=1
Oct 01 13:00:41 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct 01 13:00:41 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct 01 13:00:41 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Oct 01 13:00:41 localhost kernel: cpuidle: using governor menu
Oct 01 13:00:41 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct 01 13:00:41 localhost kernel: PCI: Using configuration type 1 for base access
Oct 01 13:00:41 localhost kernel: PCI: Using configuration type 1 for extended access
Oct 01 13:00:41 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct 01 13:00:41 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct 01 13:00:41 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct 01 13:00:41 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct 01 13:00:41 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct 01 13:00:41 localhost kernel: Demotion targets for Node 0: null
Oct 01 13:00:41 localhost kernel: cryptd: max_cpu_qlen set to 1000
Oct 01 13:00:41 localhost kernel: ACPI: Added _OSI(Module Device)
Oct 01 13:00:41 localhost kernel: ACPI: Added _OSI(Processor Device)
Oct 01 13:00:41 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct 01 13:00:41 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct 01 13:00:41 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct 01 13:00:41 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct 01 13:00:41 localhost kernel: ACPI: Interpreter enabled
Oct 01 13:00:41 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct 01 13:00:41 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Oct 01 13:00:41 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct 01 13:00:41 localhost kernel: PCI: Using E820 reservations for host bridge windows
Oct 01 13:00:41 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct 01 13:00:41 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct 01 13:00:41 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [3] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [4] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [5] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [6] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [7] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [8] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [9] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [10] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [11] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [12] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [13] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [14] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [15] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [16] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [17] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [18] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [19] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [20] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [21] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [22] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [23] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [24] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [25] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [26] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [27] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [28] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [29] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [30] registered
Oct 01 13:00:41 localhost kernel: acpiphp: Slot [31] registered
Oct 01 13:00:41 localhost kernel: PCI host bridge to bus 0000:00
Oct 01 13:00:41 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct 01 13:00:41 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct 01 13:00:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct 01 13:00:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct 01 13:00:41 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct 01 13:00:41 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct 01 13:00:41 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct 01 13:00:41 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct 01 13:00:41 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct 01 13:00:41 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct 01 13:00:41 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct 01 13:00:41 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct 01 13:00:41 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct 01 13:00:41 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 01 13:00:41 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct 01 13:00:41 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct 01 13:00:41 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct 01 13:00:41 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct 01 13:00:41 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct 01 13:00:41 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct 01 13:00:41 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct 01 13:00:41 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct 01 13:00:41 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct 01 13:00:41 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct 01 13:00:41 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct 01 13:00:41 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct 01 13:00:41 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Oct 01 13:00:41 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct 01 13:00:41 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct 01 13:00:41 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct 01 13:00:41 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct 01 13:00:41 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct 01 13:00:41 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct 01 13:00:41 localhost kernel: iommu: Default domain type: Translated
Oct 01 13:00:41 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct 01 13:00:41 localhost kernel: SCSI subsystem initialized
Oct 01 13:00:41 localhost kernel: ACPI: bus type USB registered
Oct 01 13:00:41 localhost kernel: usbcore: registered new interface driver usbfs
Oct 01 13:00:41 localhost kernel: usbcore: registered new interface driver hub
Oct 01 13:00:41 localhost kernel: usbcore: registered new device driver usb
Oct 01 13:00:41 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Oct 01 13:00:41 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct 01 13:00:41 localhost kernel: PTP clock support registered
Oct 01 13:00:41 localhost kernel: EDAC MC: Ver: 3.0.0
Oct 01 13:00:41 localhost kernel: NetLabel: Initializing
Oct 01 13:00:41 localhost kernel: NetLabel:  domain hash size = 128
Oct 01 13:00:41 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct 01 13:00:41 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Oct 01 13:00:41 localhost kernel: PCI: Using ACPI for IRQ routing
Oct 01 13:00:41 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Oct 01 13:00:41 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Oct 01 13:00:41 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Oct 01 13:00:41 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct 01 13:00:41 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct 01 13:00:41 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct 01 13:00:41 localhost kernel: vgaarb: loaded
Oct 01 13:00:41 localhost kernel: clocksource: Switched to clocksource kvm-clock
Oct 01 13:00:41 localhost kernel: VFS: Disk quotas dquot_6.6.0
Oct 01 13:00:41 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct 01 13:00:41 localhost kernel: pnp: PnP ACPI init
Oct 01 13:00:41 localhost kernel: pnp 00:03: [dma 2]
Oct 01 13:00:41 localhost kernel: pnp: PnP ACPI: found 5 devices
Oct 01 13:00:41 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct 01 13:00:41 localhost kernel: NET: Registered PF_INET protocol family
Oct 01 13:00:41 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct 01 13:00:41 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct 01 13:00:41 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct 01 13:00:41 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct 01 13:00:41 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct 01 13:00:41 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct 01 13:00:41 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct 01 13:00:41 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 01 13:00:41 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct 01 13:00:41 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct 01 13:00:41 localhost kernel: NET: Registered PF_XDP protocol family
Oct 01 13:00:41 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct 01 13:00:41 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct 01 13:00:41 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct 01 13:00:41 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct 01 13:00:41 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct 01 13:00:41 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct 01 13:00:41 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct 01 13:00:41 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 72639 usecs
Oct 01 13:00:41 localhost kernel: PCI: CLS 0 bytes, default 64
Oct 01 13:00:41 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct 01 13:00:41 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct 01 13:00:41 localhost kernel: ACPI: bus type thunderbolt registered
Oct 01 13:00:41 localhost kernel: Trying to unpack rootfs image as initramfs...
Oct 01 13:00:41 localhost kernel: Initialise system trusted keyrings
Oct 01 13:00:41 localhost kernel: Key type blacklist registered
Oct 01 13:00:41 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct 01 13:00:41 localhost kernel: zbud: loaded
Oct 01 13:00:41 localhost kernel: integrity: Platform Keyring initialized
Oct 01 13:00:41 localhost kernel: integrity: Machine keyring initialized
Oct 01 13:00:41 localhost kernel: Freeing initrd memory: 86080K
Oct 01 13:00:41 localhost kernel: NET: Registered PF_ALG protocol family
Oct 01 13:00:41 localhost kernel: xor: automatically using best checksumming function   avx       
Oct 01 13:00:41 localhost kernel: Key type asymmetric registered
Oct 01 13:00:41 localhost kernel: Asymmetric key parser 'x509' registered
Oct 01 13:00:41 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct 01 13:00:41 localhost kernel: io scheduler mq-deadline registered
Oct 01 13:00:41 localhost kernel: io scheduler kyber registered
Oct 01 13:00:41 localhost kernel: io scheduler bfq registered
Oct 01 13:00:41 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct 01 13:00:41 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct 01 13:00:41 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct 01 13:00:41 localhost kernel: ACPI: button: Power Button [PWRF]
Oct 01 13:00:41 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct 01 13:00:41 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct 01 13:00:41 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct 01 13:00:41 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct 01 13:00:41 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct 01 13:00:41 localhost kernel: Non-volatile memory driver v1.3
Oct 01 13:00:41 localhost kernel: rdac: device handler registered
Oct 01 13:00:41 localhost kernel: hp_sw: device handler registered
Oct 01 13:00:41 localhost kernel: emc: device handler registered
Oct 01 13:00:41 localhost kernel: alua: device handler registered
Oct 01 13:00:41 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct 01 13:00:41 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct 01 13:00:41 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct 01 13:00:41 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct 01 13:00:41 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct 01 13:00:41 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct 01 13:00:41 localhost kernel: usb usb1: Product: UHCI Host Controller
Oct 01 13:00:41 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-617.el9.x86_64 uhci_hcd
Oct 01 13:00:41 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct 01 13:00:41 localhost kernel: hub 1-0:1.0: USB hub found
Oct 01 13:00:41 localhost kernel: hub 1-0:1.0: 2 ports detected
Oct 01 13:00:41 localhost kernel: usbcore: registered new interface driver usbserial_generic
Oct 01 13:00:41 localhost kernel: usbserial: USB Serial support registered for generic
Oct 01 13:00:41 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct 01 13:00:41 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct 01 13:00:41 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct 01 13:00:41 localhost kernel: mousedev: PS/2 mouse device common for all mice
Oct 01 13:00:41 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Oct 01 13:00:41 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct 01 13:00:41 localhost kernel: rtc_cmos 00:04: registered as rtc0
Oct 01 13:00:41 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-10-01T13:00:40 UTC (1759323640)
Oct 01 13:00:41 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct 01 13:00:41 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct 01 13:00:41 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct 01 13:00:41 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Oct 01 13:00:41 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct 01 13:00:41 localhost kernel: usbcore: registered new interface driver usbhid
Oct 01 13:00:41 localhost kernel: usbhid: USB HID core driver
Oct 01 13:00:41 localhost kernel: drop_monitor: Initializing network drop monitor service
Oct 01 13:00:41 localhost kernel: Initializing XFRM netlink socket
Oct 01 13:00:41 localhost kernel: NET: Registered PF_INET6 protocol family
Oct 01 13:00:41 localhost kernel: Segment Routing with IPv6
Oct 01 13:00:41 localhost kernel: NET: Registered PF_PACKET protocol family
Oct 01 13:00:41 localhost kernel: mpls_gso: MPLS GSO support
Oct 01 13:00:41 localhost kernel: IPI shorthand broadcast: enabled
Oct 01 13:00:41 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Oct 01 13:00:41 localhost kernel: AES CTR mode by8 optimization enabled
Oct 01 13:00:41 localhost kernel: sched_clock: Marking stable (1134010440, 144426190)->(1392286880, -113850250)
Oct 01 13:00:41 localhost kernel: registered taskstats version 1
Oct 01 13:00:41 localhost kernel: Loading compiled-in X.509 certificates
Oct 01 13:00:41 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bb2966091bafcba340f8183756023c985dcc8fe9'
Oct 01 13:00:41 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct 01 13:00:41 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct 01 13:00:41 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct 01 13:00:41 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct 01 13:00:41 localhost kernel: Demotion targets for Node 0: null
Oct 01 13:00:41 localhost kernel: page_owner is disabled
Oct 01 13:00:41 localhost kernel: Key type .fscrypt registered
Oct 01 13:00:41 localhost kernel: Key type fscrypt-provisioning registered
Oct 01 13:00:41 localhost kernel: Key type big_key registered
Oct 01 13:00:41 localhost kernel: Key type encrypted registered
Oct 01 13:00:41 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Oct 01 13:00:41 localhost kernel: Loading compiled-in module X.509 certificates
Oct 01 13:00:41 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bb2966091bafcba340f8183756023c985dcc8fe9'
Oct 01 13:00:41 localhost kernel: ima: Allocated hash algorithm: sha256
Oct 01 13:00:41 localhost kernel: ima: No architecture policies found
Oct 01 13:00:41 localhost kernel: evm: Initialising EVM extended attributes:
Oct 01 13:00:41 localhost kernel: evm: security.selinux
Oct 01 13:00:41 localhost kernel: evm: security.SMACK64 (disabled)
Oct 01 13:00:41 localhost kernel: evm: security.SMACK64EXEC (disabled)
Oct 01 13:00:41 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct 01 13:00:41 localhost kernel: evm: security.SMACK64MMAP (disabled)
Oct 01 13:00:41 localhost kernel: evm: security.apparmor (disabled)
Oct 01 13:00:41 localhost kernel: evm: security.ima
Oct 01 13:00:41 localhost kernel: evm: security.capability
Oct 01 13:00:41 localhost kernel: evm: HMAC attrs: 0x1
Oct 01 13:00:41 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct 01 13:00:41 localhost kernel: Running certificate verification RSA selftest
Oct 01 13:00:41 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct 01 13:00:41 localhost kernel: Running certificate verification ECDSA selftest
Oct 01 13:00:41 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct 01 13:00:41 localhost kernel: clk: Disabling unused clocks
Oct 01 13:00:41 localhost kernel: Freeing unused decrypted memory: 2028K
Oct 01 13:00:41 localhost kernel: Freeing unused kernel image (initmem) memory: 4072K
Oct 01 13:00:41 localhost kernel: Write protecting the kernel read-only data: 30720k
Oct 01 13:00:41 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 348K
Oct 01 13:00:41 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct 01 13:00:41 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct 01 13:00:41 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Oct 01 13:00:41 localhost kernel: usb 1-1: Manufacturer: QEMU
Oct 01 13:00:41 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct 01 13:00:41 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct 01 13:00:41 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct 01 13:00:41 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct 01 13:00:41 localhost kernel: Run /init as init process
Oct 01 13:00:41 localhost kernel:   with arguments:
Oct 01 13:00:41 localhost kernel:     /init
Oct 01 13:00:41 localhost kernel:   with environment:
Oct 01 13:00:41 localhost kernel:     HOME=/
Oct 01 13:00:41 localhost kernel:     TERM=linux
Oct 01 13:00:41 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64
Oct 01 13:00:41 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 01 13:00:41 localhost systemd[1]: Detected virtualization kvm.
Oct 01 13:00:41 localhost systemd[1]: Detected architecture x86-64.
Oct 01 13:00:41 localhost systemd[1]: Running in initrd.
Oct 01 13:00:41 localhost systemd[1]: No hostname configured, using default hostname.
Oct 01 13:00:41 localhost systemd[1]: Hostname set to <localhost>.
Oct 01 13:00:41 localhost systemd[1]: Initializing machine ID from VM UUID.
Oct 01 13:00:41 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Oct 01 13:00:41 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 01 13:00:41 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 01 13:00:41 localhost systemd[1]: Reached target Initrd /usr File System.
Oct 01 13:00:41 localhost systemd[1]: Reached target Local File Systems.
Oct 01 13:00:41 localhost systemd[1]: Reached target Path Units.
Oct 01 13:00:41 localhost systemd[1]: Reached target Slice Units.
Oct 01 13:00:41 localhost systemd[1]: Reached target Swaps.
Oct 01 13:00:41 localhost systemd[1]: Reached target Timer Units.
Oct 01 13:00:41 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 01 13:00:41 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Oct 01 13:00:41 localhost systemd[1]: Listening on Journal Socket.
Oct 01 13:00:41 localhost systemd[1]: Listening on udev Control Socket.
Oct 01 13:00:41 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 01 13:00:41 localhost systemd[1]: Reached target Socket Units.
Oct 01 13:00:41 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 01 13:00:41 localhost systemd[1]: Starting Journal Service...
Oct 01 13:00:41 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct 01 13:00:41 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 01 13:00:41 localhost systemd[1]: Starting Create System Users...
Oct 01 13:00:41 localhost systemd[1]: Starting Setup Virtual Console...
Oct 01 13:00:41 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 01 13:00:41 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 01 13:00:41 localhost systemd[1]: Finished Create System Users.
Oct 01 13:00:41 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 01 13:00:41 localhost systemd-journald[308]: Journal started
Oct 01 13:00:41 localhost systemd-journald[308]: Runtime Journal (/run/log/journal/3609044e8b894b90b4b0db5bb7be664b) is 8.0M, max 153.5M, 145.5M free.
Oct 01 13:00:41 localhost systemd-sysusers[312]: Creating group 'users' with GID 100.
Oct 01 13:00:41 localhost systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Oct 01 13:00:41 localhost systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct 01 13:00:41 localhost systemd[1]: Started Journal Service.
Oct 01 13:00:41 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 01 13:00:41 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 01 13:00:41 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 01 13:00:41 localhost systemd[1]: Finished Setup Virtual Console.
Oct 01 13:00:41 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct 01 13:00:41 localhost systemd[1]: Starting dracut cmdline hook...
Oct 01 13:00:41 localhost dracut-cmdline[326]: dracut-9 dracut-057-102.git20250818.el9
Oct 01 13:00:41 localhost dracut-cmdline[326]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct 01 13:00:41 localhost systemd[1]: Finished dracut cmdline hook.
Oct 01 13:00:41 localhost systemd[1]: Starting dracut pre-udev hook...
Oct 01 13:00:41 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct 01 13:00:41 localhost kernel: device-mapper: uevent: version 1.0.3
Oct 01 13:00:41 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct 01 13:00:41 localhost kernel: RPC: Registered named UNIX socket transport module.
Oct 01 13:00:41 localhost kernel: RPC: Registered udp transport module.
Oct 01 13:00:41 localhost kernel: RPC: Registered tcp transport module.
Oct 01 13:00:41 localhost kernel: RPC: Registered tcp-with-tls transport module.
Oct 01 13:00:41 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct 01 13:00:42 localhost rpc.statd[442]: Version 2.5.4 starting
Oct 01 13:00:42 localhost rpc.statd[442]: Initializing NSM state
Oct 01 13:00:42 localhost rpc.idmapd[447]: Setting log level to 0
Oct 01 13:00:42 localhost systemd[1]: Finished dracut pre-udev hook.
Oct 01 13:00:42 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 01 13:00:42 localhost systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Oct 01 13:00:42 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 01 13:00:42 localhost systemd[1]: Starting dracut pre-trigger hook...
Oct 01 13:00:42 localhost systemd[1]: Finished dracut pre-trigger hook.
Oct 01 13:00:42 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 01 13:00:42 localhost systemd[1]: Created slice Slice /system/modprobe.
Oct 01 13:00:42 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 01 13:00:42 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 01 13:00:42 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 01 13:00:42 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 01 13:00:42 localhost systemd[1]: Mounting Kernel Configuration File System...
Oct 01 13:00:42 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 01 13:00:42 localhost systemd[1]: Reached target Network.
Oct 01 13:00:42 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct 01 13:00:42 localhost systemd[1]: Starting dracut initqueue hook...
Oct 01 13:00:42 localhost systemd[1]: Mounted Kernel Configuration File System.
Oct 01 13:00:42 localhost systemd[1]: Reached target System Initialization.
Oct 01 13:00:42 localhost systemd[1]: Reached target Basic System.
Oct 01 13:00:42 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct 01 13:00:42 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct 01 13:00:42 localhost kernel:  vda: vda1
Oct 01 13:00:42 localhost kernel: libata version 3.00 loaded.
Oct 01 13:00:42 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Oct 01 13:00:42 localhost kernel: scsi host0: ata_piix
Oct 01 13:00:42 localhost kernel: scsi host1: ata_piix
Oct 01 13:00:42 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Oct 01 13:00:42 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Oct 01 13:00:42 localhost systemd-udevd[493]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 13:00:42 localhost systemd[1]: Found device /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8.
Oct 01 13:00:42 localhost systemd[1]: Reached target Initrd Root Device.
Oct 01 13:00:42 localhost kernel: ata1: found unknown device (class 0)
Oct 01 13:00:42 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct 01 13:00:42 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct 01 13:00:42 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct 01 13:00:42 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct 01 13:00:42 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct 01 13:00:42 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Oct 01 13:00:42 localhost systemd[1]: Finished dracut initqueue hook.
Oct 01 13:00:42 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Oct 01 13:00:42 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Oct 01 13:00:42 localhost systemd[1]: Reached target Remote File Systems.
Oct 01 13:00:42 localhost systemd[1]: Starting dracut pre-mount hook...
Oct 01 13:00:42 localhost systemd[1]: Finished dracut pre-mount hook.
Oct 01 13:00:42 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8...
Oct 01 13:00:42 localhost systemd-fsck[557]: /usr/sbin/fsck.xfs: XFS file system.
Oct 01 13:00:42 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8.
Oct 01 13:00:42 localhost systemd[1]: Mounting /sysroot...
Oct 01 13:00:43 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct 01 13:00:43 localhost kernel: XFS (vda1): Mounting V5 Filesystem d6a81468-b74c-4055-b485-def635ab40f8
Oct 01 13:00:43 localhost kernel: XFS (vda1): Ending clean mount
Oct 01 13:00:43 localhost systemd[1]: Mounted /sysroot.
Oct 01 13:00:43 localhost systemd[1]: Reached target Initrd Root File System.
Oct 01 13:00:43 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct 01 13:00:43 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct 01 13:00:43 localhost systemd[1]: Reached target Initrd File Systems.
Oct 01 13:00:43 localhost systemd[1]: Reached target Initrd Default Target.
Oct 01 13:00:43 localhost systemd[1]: Starting dracut mount hook...
Oct 01 13:00:43 localhost systemd[1]: Finished dracut mount hook.
Oct 01 13:00:43 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct 01 13:00:43 localhost rpc.idmapd[447]: exiting on signal 15
Oct 01 13:00:43 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct 01 13:00:43 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct 01 13:00:43 localhost systemd[1]: Stopped target Network.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Timer Units.
Oct 01 13:00:43 localhost systemd[1]: dbus.socket: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Oct 01 13:00:43 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Initrd Default Target.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Basic System.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Initrd Root Device.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Initrd /usr File System.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Path Units.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Remote File Systems.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Slice Units.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Socket Units.
Oct 01 13:00:43 localhost systemd[1]: Stopped target System Initialization.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Local File Systems.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Swaps.
Oct 01 13:00:43 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped dracut mount hook.
Oct 01 13:00:43 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped dracut pre-mount hook.
Oct 01 13:00:43 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Oct 01 13:00:43 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct 01 13:00:43 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped dracut initqueue hook.
Oct 01 13:00:43 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped Apply Kernel Variables.
Oct 01 13:00:43 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Oct 01 13:00:43 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped Coldplug All udev Devices.
Oct 01 13:00:43 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped dracut pre-trigger hook.
Oct 01 13:00:43 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct 01 13:00:43 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped Setup Virtual Console.
Oct 01 13:00:43 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct 01 13:00:43 localhost systemd[1]: systemd-udevd.service: Consumed 1.050s CPU time.
Oct 01 13:00:43 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Closed udev Control Socket.
Oct 01 13:00:43 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Closed udev Kernel Socket.
Oct 01 13:00:43 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped dracut pre-udev hook.
Oct 01 13:00:43 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped dracut cmdline hook.
Oct 01 13:00:43 localhost systemd[1]: Starting Cleanup udev Database...
Oct 01 13:00:43 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct 01 13:00:43 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Oct 01 13:00:43 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Stopped Create System Users.
Oct 01 13:00:43 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct 01 13:00:43 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct 01 13:00:43 localhost systemd[1]: Finished Cleanup udev Database.
Oct 01 13:00:43 localhost systemd[1]: Reached target Switch Root.
Oct 01 13:00:43 localhost systemd[1]: Starting Switch Root...
Oct 01 13:00:43 localhost systemd[1]: Switching root.
Oct 01 13:00:43 localhost systemd-journald[308]: Journal stopped
Oct 01 13:00:45 localhost systemd-journald[308]: Received SIGTERM from PID 1 (systemd).
Oct 01 13:00:45 localhost kernel: audit: type=1404 audit(1759323644.182:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct 01 13:00:45 localhost kernel: SELinux:  policy capability network_peer_controls=1
Oct 01 13:00:45 localhost kernel: SELinux:  policy capability open_perms=1
Oct 01 13:00:45 localhost kernel: SELinux:  policy capability extended_socket_class=1
Oct 01 13:00:45 localhost kernel: SELinux:  policy capability always_check_network=0
Oct 01 13:00:45 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 01 13:00:45 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 01 13:00:45 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 01 13:00:45 localhost kernel: audit: type=1403 audit(1759323644.332:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct 01 13:00:45 localhost systemd[1]: Successfully loaded SELinux policy in 158.083ms.
Oct 01 13:00:45 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 30.036ms.
Oct 01 13:00:45 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct 01 13:00:45 localhost systemd[1]: Detected virtualization kvm.
Oct 01 13:00:45 localhost systemd[1]: Detected architecture x86-64.
Oct 01 13:00:45 localhost systemd-rc-local-generator[638]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:00:45 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Oct 01 13:00:45 localhost systemd[1]: Stopped Switch Root.
Oct 01 13:00:45 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct 01 13:00:45 localhost systemd[1]: Created slice Slice /system/getty.
Oct 01 13:00:45 localhost systemd[1]: Created slice Slice /system/serial-getty.
Oct 01 13:00:45 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Oct 01 13:00:45 localhost systemd[1]: Created slice User and Session Slice.
Oct 01 13:00:45 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Oct 01 13:00:45 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Oct 01 13:00:45 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct 01 13:00:45 localhost systemd[1]: Reached target Local Encrypted Volumes.
Oct 01 13:00:45 localhost systemd[1]: Stopped target Switch Root.
Oct 01 13:00:45 localhost systemd[1]: Stopped target Initrd File Systems.
Oct 01 13:00:45 localhost systemd[1]: Stopped target Initrd Root File System.
Oct 01 13:00:45 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Oct 01 13:00:45 localhost systemd[1]: Reached target Path Units.
Oct 01 13:00:45 localhost systemd[1]: Reached target rpc_pipefs.target.
Oct 01 13:00:45 localhost systemd[1]: Reached target Slice Units.
Oct 01 13:00:45 localhost systemd[1]: Reached target Swaps.
Oct 01 13:00:45 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Oct 01 13:00:45 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Oct 01 13:00:45 localhost systemd[1]: Reached target RPC Port Mapper.
Oct 01 13:00:45 localhost systemd[1]: Listening on Process Core Dump Socket.
Oct 01 13:00:45 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Oct 01 13:00:45 localhost systemd[1]: Listening on udev Control Socket.
Oct 01 13:00:45 localhost systemd[1]: Listening on udev Kernel Socket.
Oct 01 13:00:45 localhost systemd[1]: Mounting Huge Pages File System...
Oct 01 13:00:45 localhost systemd[1]: Mounting POSIX Message Queue File System...
Oct 01 13:00:45 localhost systemd[1]: Mounting Kernel Debug File System...
Oct 01 13:00:45 localhost systemd[1]: Mounting Kernel Trace File System...
Oct 01 13:00:45 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 01 13:00:45 localhost systemd[1]: Starting Create List of Static Device Nodes...
Oct 01 13:00:45 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 01 13:00:45 localhost systemd[1]: Starting Load Kernel Module drm...
Oct 01 13:00:45 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Oct 01 13:00:45 localhost systemd[1]: Starting Load Kernel Module fuse...
Oct 01 13:00:45 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct 01 13:00:45 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Oct 01 13:00:45 localhost systemd[1]: Stopped File System Check on Root Device.
Oct 01 13:00:45 localhost systemd[1]: Stopped Journal Service.
Oct 01 13:00:45 localhost kernel: fuse: init (API version 7.37)
Oct 01 13:00:45 localhost systemd[1]: Starting Journal Service...
Oct 01 13:00:45 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct 01 13:00:45 localhost systemd[1]: Starting Generate network units from Kernel command line...
Oct 01 13:00:45 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 01 13:00:45 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Oct 01 13:00:45 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct 01 13:00:45 localhost systemd[1]: Starting Apply Kernel Variables...
Oct 01 13:00:45 localhost systemd[1]: Starting Coldplug All udev Devices...
Oct 01 13:00:45 localhost kernel: ACPI: bus type drm_connector registered
Oct 01 13:00:45 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct 01 13:00:45 localhost systemd[1]: Mounted Huge Pages File System.
Oct 01 13:00:45 localhost systemd-journald[679]: Journal started
Oct 01 13:00:45 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/21983c68f36a73745cc172a394ebc51d) is 8.0M, max 153.5M, 145.5M free.
Oct 01 13:00:45 localhost systemd[1]: Mounted POSIX Message Queue File System.
Oct 01 13:00:44 localhost systemd[1]: Queued start job for default target Multi-User System.
Oct 01 13:00:44 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Oct 01 13:00:45 localhost systemd[1]: Started Journal Service.
Oct 01 13:00:45 localhost systemd[1]: Mounted Kernel Debug File System.
Oct 01 13:00:45 localhost systemd[1]: Mounted Kernel Trace File System.
Oct 01 13:00:45 localhost systemd[1]: Finished Create List of Static Device Nodes.
Oct 01 13:00:45 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 01 13:00:45 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 01 13:00:45 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct 01 13:00:45 localhost systemd[1]: Finished Load Kernel Module drm.
Oct 01 13:00:45 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct 01 13:00:45 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Oct 01 13:00:45 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct 01 13:00:45 localhost systemd[1]: Finished Load Kernel Module fuse.
Oct 01 13:00:45 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct 01 13:00:45 localhost systemd[1]: Finished Generate network units from Kernel command line.
Oct 01 13:00:45 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Oct 01 13:00:45 localhost systemd[1]: Finished Apply Kernel Variables.
Oct 01 13:00:45 localhost systemd[1]: Mounting FUSE Control File System...
Oct 01 13:00:45 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 01 13:00:45 localhost systemd[1]: Starting Rebuild Hardware Database...
Oct 01 13:00:45 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Oct 01 13:00:45 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct 01 13:00:45 localhost systemd[1]: Starting Load/Save OS Random Seed...
Oct 01 13:00:45 localhost systemd[1]: Starting Create System Users...
Oct 01 13:00:45 localhost systemd[1]: Mounted FUSE Control File System.
Oct 01 13:00:45 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/21983c68f36a73745cc172a394ebc51d) is 8.0M, max 153.5M, 145.5M free.
Oct 01 13:00:45 localhost systemd-journald[679]: Received client request to flush runtime journal.
Oct 01 13:00:45 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Oct 01 13:00:45 localhost systemd[1]: Finished Load/Save OS Random Seed.
Oct 01 13:00:45 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct 01 13:00:45 localhost systemd[1]: Finished Create System Users.
Oct 01 13:00:45 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Oct 01 13:00:45 localhost systemd[1]: Finished Coldplug All udev Devices.
Oct 01 13:00:45 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Oct 01 13:00:45 localhost systemd[1]: Reached target Preparation for Local File Systems.
Oct 01 13:00:45 localhost systemd[1]: Reached target Local File Systems.
Oct 01 13:00:45 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct 01 13:00:45 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct 01 13:00:45 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct 01 13:00:45 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct 01 13:00:45 localhost systemd[1]: Starting Automatic Boot Loader Update...
Oct 01 13:00:45 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct 01 13:00:45 localhost systemd[1]: Starting Create Volatile Files and Directories...
Oct 01 13:00:45 localhost bootctl[697]: Couldn't find EFI system partition, skipping.
Oct 01 13:00:45 localhost systemd[1]: Finished Automatic Boot Loader Update.
Oct 01 13:00:45 localhost systemd[1]: Finished Create Volatile Files and Directories.
Oct 01 13:00:45 localhost systemd[1]: Starting Security Auditing Service...
Oct 01 13:00:45 localhost systemd[1]: Starting RPC Bind...
Oct 01 13:00:45 localhost systemd[1]: Starting Rebuild Journal Catalog...
Oct 01 13:00:45 localhost auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct 01 13:00:45 localhost auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct 01 13:00:45 localhost systemd[1]: Started RPC Bind.
Oct 01 13:00:45 localhost systemd[1]: Finished Rebuild Journal Catalog.
Oct 01 13:00:45 localhost augenrules[708]: /sbin/augenrules: No change
Oct 01 13:00:45 localhost augenrules[723]: No rules
Oct 01 13:00:45 localhost augenrules[723]: enabled 1
Oct 01 13:00:45 localhost augenrules[723]: failure 1
Oct 01 13:00:45 localhost augenrules[723]: pid 703
Oct 01 13:00:45 localhost augenrules[723]: rate_limit 0
Oct 01 13:00:45 localhost augenrules[723]: backlog_limit 8192
Oct 01 13:00:45 localhost augenrules[723]: lost 0
Oct 01 13:00:45 localhost augenrules[723]: backlog 0
Oct 01 13:00:45 localhost augenrules[723]: backlog_wait_time 60000
Oct 01 13:00:45 localhost augenrules[723]: backlog_wait_time_actual 0
Oct 01 13:00:45 localhost augenrules[723]: enabled 1
Oct 01 13:00:45 localhost augenrules[723]: failure 1
Oct 01 13:00:45 localhost augenrules[723]: pid 703
Oct 01 13:00:45 localhost augenrules[723]: rate_limit 0
Oct 01 13:00:45 localhost augenrules[723]: backlog_limit 8192
Oct 01 13:00:45 localhost augenrules[723]: lost 0
Oct 01 13:00:45 localhost augenrules[723]: backlog 4
Oct 01 13:00:45 localhost augenrules[723]: backlog_wait_time 60000
Oct 01 13:00:45 localhost augenrules[723]: backlog_wait_time_actual 0
Oct 01 13:00:45 localhost augenrules[723]: enabled 1
Oct 01 13:00:45 localhost augenrules[723]: failure 1
Oct 01 13:00:45 localhost augenrules[723]: pid 703
Oct 01 13:00:45 localhost augenrules[723]: rate_limit 0
Oct 01 13:00:45 localhost augenrules[723]: backlog_limit 8192
Oct 01 13:00:45 localhost augenrules[723]: lost 0
Oct 01 13:00:45 localhost augenrules[723]: backlog 0
Oct 01 13:00:45 localhost augenrules[723]: backlog_wait_time 60000
Oct 01 13:00:45 localhost augenrules[723]: backlog_wait_time_actual 0
Oct 01 13:00:45 localhost systemd[1]: Started Security Auditing Service.
Oct 01 13:00:45 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct 01 13:00:45 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct 01 13:00:45 localhost systemd[1]: Finished Rebuild Hardware Database.
Oct 01 13:00:45 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct 01 13:00:45 localhost systemd-udevd[732]: Using default interface naming scheme 'rhel-9.0'.
Oct 01 13:00:45 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct 01 13:00:45 localhost systemd[1]: Starting Update is Completed...
Oct 01 13:00:45 localhost systemd[1]: Finished Update is Completed.
Oct 01 13:00:45 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct 01 13:00:45 localhost systemd[1]: Reached target System Initialization.
Oct 01 13:00:45 localhost systemd[1]: Started dnf makecache --timer.
Oct 01 13:00:45 localhost systemd[1]: Started Daily rotation of log files.
Oct 01 13:00:45 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct 01 13:00:45 localhost systemd[1]: Reached target Timer Units.
Oct 01 13:00:45 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct 01 13:00:45 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct 01 13:00:45 localhost systemd[1]: Reached target Socket Units.
Oct 01 13:00:45 localhost systemd[1]: Starting D-Bus System Message Bus...
Oct 01 13:00:45 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 01 13:00:45 localhost systemd[1]: Starting Load Kernel Module configfs...
Oct 01 13:00:45 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct 01 13:00:45 localhost systemd[1]: Finished Load Kernel Module configfs.
Oct 01 13:00:45 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct 01 13:00:45 localhost systemd[1]: Started D-Bus System Message Bus.
Oct 01 13:00:45 localhost systemd[1]: Reached target Basic System.
Oct 01 13:00:45 localhost dbus-broker-lau[748]: Ready
Oct 01 13:00:45 localhost systemd-udevd[744]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 13:00:45 localhost systemd[1]: Starting NTP client/server...
Oct 01 13:00:45 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct 01 13:00:45 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct 01 13:00:46 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct 01 13:00:46 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct 01 13:00:46 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct 01 13:00:46 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct 01 13:00:46 localhost systemd[1]: Starting IPv4 firewall with iptables...
Oct 01 13:00:46 localhost systemd[1]: Started irqbalance daemon.
Oct 01 13:00:46 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct 01 13:00:46 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 01 13:00:46 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 01 13:00:46 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 01 13:00:46 localhost systemd[1]: Reached target sshd-keygen.target.
Oct 01 13:00:46 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct 01 13:00:46 localhost systemd[1]: Reached target User and Group Name Lookups.
Oct 01 13:00:46 localhost chronyd[793]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 01 13:00:46 localhost chronyd[793]: Loaded 0 symmetric keys
Oct 01 13:00:46 localhost chronyd[793]: Using right/UTC timezone to obtain leap second data
Oct 01 13:00:46 localhost chronyd[793]: Loaded seccomp filter (level 2)
Oct 01 13:00:46 localhost systemd[1]: Starting User Login Management...
Oct 01 13:00:46 localhost systemd[1]: Started NTP client/server.
Oct 01 13:00:46 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct 01 13:00:46 localhost systemd-logind[791]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 01 13:00:46 localhost systemd-logind[791]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 01 13:00:46 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct 01 13:00:46 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct 01 13:00:46 localhost systemd-logind[791]: New seat seat0.
Oct 01 13:00:46 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct 01 13:00:46 localhost systemd[1]: Started User Login Management.
Oct 01 13:00:46 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct 01 13:00:46 localhost kernel: Console: switching to colour dummy device 80x25
Oct 01 13:00:46 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct 01 13:00:46 localhost kernel: [drm] features: -context_init
Oct 01 13:00:46 localhost kernel: [drm] number of scanouts: 1
Oct 01 13:00:46 localhost kernel: [drm] number of cap sets: 0
Oct 01 13:00:46 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct 01 13:00:46 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct 01 13:00:46 localhost kernel: Console: switching to colour frame buffer device 128x48
Oct 01 13:00:46 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct 01 13:00:46 localhost kernel: kvm_amd: TSC scaling supported
Oct 01 13:00:46 localhost kernel: kvm_amd: Nested Virtualization enabled
Oct 01 13:00:46 localhost kernel: kvm_amd: Nested Paging enabled
Oct 01 13:00:46 localhost kernel: kvm_amd: LBR virtualization supported
Oct 01 13:00:46 localhost iptables.init[781]: iptables: Applying firewall rules: [  OK  ]
Oct 01 13:00:46 localhost systemd[1]: Finished IPv4 firewall with iptables.
Oct 01 13:00:46 localhost cloud-init[842]: Cloud-init v. 24.4-7.el9 running 'init-local' at Wed, 01 Oct 2025 13:00:46 +0000. Up 7.31 seconds.
Oct 01 13:00:46 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Oct 01 13:00:46 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Oct 01 13:00:46 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpo62b1efz.mount: Deactivated successfully.
Oct 01 13:00:47 localhost systemd[1]: Starting Hostname Service...
Oct 01 13:00:47 localhost systemd[1]: Started Hostname Service.
Oct 01 13:00:47 np0005464591.novalocal systemd-hostnamed[856]: Hostname set to <np0005464591.novalocal> (static)
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Reached target Preparation for Network.
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Starting Network Manager...
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.4438] NetworkManager (version 1.54.1-1.el9) is starting... (boot:1fa3a04c-5158-4a26-9fa0-b8b34ae08d38)
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.4446] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.4613] manager[0x563109734080]: monitoring kernel firmware directory '/lib/firmware'.
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.4684] hostname: hostname: using hostnamed
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.4686] hostname: static hostname changed from (none) to "np0005464591.novalocal"
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.4693] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.4878] manager[0x563109734080]: rfkill: Wi-Fi hardware radio set enabled
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.4878] manager[0x563109734080]: rfkill: WWAN hardware radio set enabled
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.4966] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.4966] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.4967] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.4967] manager: Networking is enabled by state file
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.4969] settings: Loaded settings plugin: keyfile (internal)
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5004] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5036] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5062] dhcp: init: Using DHCP client 'internal'
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5065] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5083] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5102] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5112] device (lo): Activation: starting connection 'lo' (d1361516-740f-4fdb-ad0c-6174cd593c78)
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5124] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5128] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5164] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5169] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5171] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5173] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5175] device (eth0): carrier: link connected
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5178] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5186] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5195] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5201] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5202] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5205] manager: NetworkManager state is now CONNECTING
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5207] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5214] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5218] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Started Network Manager.
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Reached target Network.
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Starting Network Manager Wait Online...
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5620] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5622] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 01 13:00:47 np0005464591.novalocal NetworkManager[860]: <info>  [1759323647.5630] device (lo): Activation: successful, device activated.
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Reached target NFS client services.
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: Reached target Remote File Systems.
Oct 01 13:00:47 np0005464591.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct 01 13:00:48 np0005464591.novalocal NetworkManager[860]: <info>  [1759323648.5030] dhcp4 (eth0): state changed new lease, address=38.102.83.163
Oct 01 13:00:48 np0005464591.novalocal NetworkManager[860]: <info>  [1759323648.5049] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 01 13:00:48 np0005464591.novalocal NetworkManager[860]: <info>  [1759323648.5084] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 01 13:00:48 np0005464591.novalocal NetworkManager[860]: <info>  [1759323648.5122] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 01 13:00:48 np0005464591.novalocal NetworkManager[860]: <info>  [1759323648.5129] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 01 13:00:48 np0005464591.novalocal NetworkManager[860]: <info>  [1759323648.5138] manager: NetworkManager state is now CONNECTED_SITE
Oct 01 13:00:48 np0005464591.novalocal NetworkManager[860]: <info>  [1759323648.5148] device (eth0): Activation: successful, device activated.
Oct 01 13:00:48 np0005464591.novalocal NetworkManager[860]: <info>  [1759323648.5157] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 01 13:00:48 np0005464591.novalocal NetworkManager[860]: <info>  [1759323648.5166] manager: startup complete
Oct 01 13:00:48 np0005464591.novalocal systemd[1]: Finished Network Manager Wait Online.
Oct 01 13:00:48 np0005464591.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: Cloud-init v. 24.4-7.el9 running 'init' at Wed, 01 Oct 2025 13:00:48 +0000. Up 9.45 seconds.
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: |  eth0  | True |        38.102.83.163         | 255.255.255.0 | global | fa:16:3e:94:a5:35 |
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: |  eth0  | True | fe80::f816:3eff:fe94:a535/64 |       .       |  link  | fa:16:3e:94:a5:35 |
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Oct 01 13:00:48 np0005464591.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Oct 01 13:00:49 np0005464591.novalocal useradd[990]: new group: name=cloud-user, GID=1001
Oct 01 13:00:49 np0005464591.novalocal useradd[990]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Oct 01 13:00:49 np0005464591.novalocal useradd[990]: add 'cloud-user' to group 'adm'
Oct 01 13:00:49 np0005464591.novalocal useradd[990]: add 'cloud-user' to group 'systemd-journal'
Oct 01 13:00:49 np0005464591.novalocal useradd[990]: add 'cloud-user' to shadow group 'adm'
Oct 01 13:00:49 np0005464591.novalocal useradd[990]: add 'cloud-user' to shadow group 'systemd-journal'
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: Generating public/private rsa key pair.
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: The key fingerprint is:
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: SHA256:Lu534C+RTWyx275bZoUu8UORgH4W9z1celookU4kAfc root@np0005464591.novalocal
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: The key's randomart image is:
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: +---[RSA 3072]----+
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |        ..++o.   |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |         .oo=....|
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |         o =E+++o|
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |          * = o==|
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |        S= =..o+o|
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |       .+ o .=.. |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |      ...o .. B  |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |     . .+ . .= . |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |     .o. +. oo   |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: +----[SHA256]-----+
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: Generating public/private ecdsa key pair.
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: The key fingerprint is:
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: SHA256:hepui6Wv9fMh5BSgbkucgTEAX01few2LWNcn+QIP1Ck root@np0005464591.novalocal
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: The key's randomart image is:
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: +---[ECDSA 256]---+
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |+.o .oo   o.+o o |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: | . = ..o = +E+* .|
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |  o o   = + o=.+ |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |   o o . o .  o .|
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |    * . S      . |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |   o o +         |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |    . + o .      |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |     *.... .     |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |    +++..o.      |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: +----[SHA256]-----+
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: Generating public/private ed25519 key pair.
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: The key fingerprint is:
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: SHA256:d79iYgGHCKTRc0no4o56zDqLkYl7spixomisqLlO9kc root@np0005464591.novalocal
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: The key's randomart image is:
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: +--[ED25519 256]--+
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |  .o.o..         |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |   o= o          |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |  .. + . .       |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |  . . . o .      |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: | . .    So. .    |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |.o.  E   ... .   |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |BB  .      .  .  |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |%XB  .    o o  . |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: |^@ ..    . o ..  |
Oct 01 13:00:50 np0005464591.novalocal cloud-init[923]: +----[SHA256]-----+
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Reached target Cloud-config availability.
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Reached target Network is Online.
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Starting System Logging Service...
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Starting OpenSSH server daemon...
Oct 01 13:00:50 np0005464591.novalocal sm-notify[1005]: Version 2.5.4 starting
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Starting Permit User Sessions...
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Started Notify NFS peers of a restart.
Oct 01 13:00:50 np0005464591.novalocal sshd[1007]: Server listening on 0.0.0.0 port 22.
Oct 01 13:00:50 np0005464591.novalocal sshd[1007]: Server listening on :: port 22.
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Started OpenSSH server daemon.
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Finished Permit User Sessions.
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Started Command Scheduler.
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Started Getty on tty1.
Oct 01 13:00:50 np0005464591.novalocal crond[1009]: (CRON) STARTUP (1.5.7)
Oct 01 13:00:50 np0005464591.novalocal crond[1009]: (CRON) INFO (Syslog will be used instead of sendmail.)
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Started Serial Getty on ttyS0.
Oct 01 13:00:50 np0005464591.novalocal crond[1009]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 71% if used.)
Oct 01 13:00:50 np0005464591.novalocal crond[1009]: (CRON) INFO (running with inotify support)
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Reached target Login Prompts.
Oct 01 13:00:50 np0005464591.novalocal rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Oct 01 13:00:50 np0005464591.novalocal rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Started System Logging Service.
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Reached target Multi-User System.
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct 01 13:00:50 np0005464591.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Oct 01 13:00:50 np0005464591.novalocal rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 01 13:00:50 np0005464591.novalocal cloud-init[1020]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Wed, 01 Oct 2025 13:00:50 +0000. Up 11.46 seconds.
Oct 01 13:00:51 np0005464591.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Oct 01 13:00:51 np0005464591.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Oct 01 13:00:51 np0005464591.novalocal cloud-init[1024]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Wed, 01 Oct 2025 13:00:51 +0000. Up 11.90 seconds.
Oct 01 13:00:51 np0005464591.novalocal cloud-init[1026]: #############################################################
Oct 01 13:00:51 np0005464591.novalocal cloud-init[1027]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct 01 13:00:51 np0005464591.novalocal cloud-init[1029]: 256 SHA256:hepui6Wv9fMh5BSgbkucgTEAX01few2LWNcn+QIP1Ck root@np0005464591.novalocal (ECDSA)
Oct 01 13:00:51 np0005464591.novalocal cloud-init[1031]: 256 SHA256:d79iYgGHCKTRc0no4o56zDqLkYl7spixomisqLlO9kc root@np0005464591.novalocal (ED25519)
Oct 01 13:00:51 np0005464591.novalocal cloud-init[1033]: 3072 SHA256:Lu534C+RTWyx275bZoUu8UORgH4W9z1celookU4kAfc root@np0005464591.novalocal (RSA)
Oct 01 13:00:51 np0005464591.novalocal cloud-init[1034]: -----END SSH HOST KEY FINGERPRINTS-----
Oct 01 13:00:51 np0005464591.novalocal cloud-init[1035]: #############################################################
Oct 01 13:00:51 np0005464591.novalocal cloud-init[1024]: Cloud-init v. 24.4-7.el9 finished at Wed, 01 Oct 2025 13:00:51 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 12.09 seconds
Oct 01 13:00:51 np0005464591.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Oct 01 13:00:51 np0005464591.novalocal systemd[1]: Reached target Cloud-init target.
Oct 01 13:00:51 np0005464591.novalocal systemd[1]: Startup finished in 1.559s (kernel) + 3.203s (initrd) + 7.404s (userspace) = 12.167s.
Oct 01 13:00:52 np0005464591.novalocal sshd-session[1042]: Unable to negotiate with 38.102.83.114 port 59024: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Oct 01 13:00:52 np0005464591.novalocal sshd-session[1046]: Unable to negotiate with 38.102.83.114 port 59050: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Oct 01 13:00:52 np0005464591.novalocal sshd-session[1048]: Unable to negotiate with 38.102.83.114 port 59062: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Oct 01 13:00:52 np0005464591.novalocal sshd-session[1040]: Connection closed by 38.102.83.114 port 59010 [preauth]
Oct 01 13:00:52 np0005464591.novalocal sshd-session[1044]: Connection closed by 38.102.83.114 port 59034 [preauth]
Oct 01 13:00:52 np0005464591.novalocal sshd-session[1054]: Unable to negotiate with 38.102.83.114 port 59088: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Oct 01 13:00:52 np0005464591.novalocal sshd-session[1056]: Unable to negotiate with 38.102.83.114 port 59090: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Oct 01 13:00:52 np0005464591.novalocal sshd-session[1050]: Connection closed by 38.102.83.114 port 59068 [preauth]
Oct 01 13:00:52 np0005464591.novalocal sshd-session[1052]: Connection closed by 38.102.83.114 port 59076 [preauth]
Oct 01 13:00:53 np0005464591.novalocal chronyd[793]: Selected source 54.39.23.64 (2.centos.pool.ntp.org)
Oct 01 13:00:53 np0005464591.novalocal chronyd[793]: System clock TAI offset set to 37 seconds
Oct 01 13:00:56 np0005464591.novalocal irqbalance[786]: Cannot change IRQ 25 affinity: Operation not permitted
Oct 01 13:00:56 np0005464591.novalocal irqbalance[786]: IRQ 25 affinity is now unmanaged
Oct 01 13:00:56 np0005464591.novalocal irqbalance[786]: Cannot change IRQ 31 affinity: Operation not permitted
Oct 01 13:00:56 np0005464591.novalocal irqbalance[786]: IRQ 31 affinity is now unmanaged
Oct 01 13:00:56 np0005464591.novalocal irqbalance[786]: Cannot change IRQ 28 affinity: Operation not permitted
Oct 01 13:00:56 np0005464591.novalocal irqbalance[786]: IRQ 28 affinity is now unmanaged
Oct 01 13:00:56 np0005464591.novalocal irqbalance[786]: Cannot change IRQ 32 affinity: Operation not permitted
Oct 01 13:00:56 np0005464591.novalocal irqbalance[786]: IRQ 32 affinity is now unmanaged
Oct 01 13:00:56 np0005464591.novalocal irqbalance[786]: Cannot change IRQ 30 affinity: Operation not permitted
Oct 01 13:00:56 np0005464591.novalocal irqbalance[786]: IRQ 30 affinity is now unmanaged
Oct 01 13:00:56 np0005464591.novalocal irqbalance[786]: Cannot change IRQ 29 affinity: Operation not permitted
Oct 01 13:00:56 np0005464591.novalocal irqbalance[786]: IRQ 29 affinity is now unmanaged
Oct 01 13:00:58 np0005464591.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 01 13:01:01 np0005464591.novalocal CROND[1059]: (root) CMD (run-parts /etc/cron.hourly)
Oct 01 13:01:01 np0005464591.novalocal run-parts[1062]: (/etc/cron.hourly) starting 0anacron
Oct 01 13:01:01 np0005464591.novalocal anacron[1070]: Anacron started on 2025-10-01
Oct 01 13:01:01 np0005464591.novalocal anacron[1070]: Will run job `cron.daily' in 32 min.
Oct 01 13:01:01 np0005464591.novalocal anacron[1070]: Will run job `cron.weekly' in 52 min.
Oct 01 13:01:01 np0005464591.novalocal anacron[1070]: Will run job `cron.monthly' in 72 min.
Oct 01 13:01:01 np0005464591.novalocal anacron[1070]: Jobs will be executed sequentially
Oct 01 13:01:01 np0005464591.novalocal run-parts[1072]: (/etc/cron.hourly) finished 0anacron
Oct 01 13:01:01 np0005464591.novalocal CROND[1058]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 01 13:01:12 np0005464591.novalocal sshd-session[1073]: Accepted publickey for zuul from 38.102.83.114 port 34908 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Oct 01 13:01:12 np0005464591.novalocal systemd[1]: Created slice User Slice of UID 1000.
Oct 01 13:01:12 np0005464591.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct 01 13:01:12 np0005464591.novalocal systemd-logind[791]: New session 1 of user zuul.
Oct 01 13:01:12 np0005464591.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct 01 13:01:12 np0005464591.novalocal systemd[1]: Starting User Manager for UID 1000...
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Queued start job for default target Main User Target.
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Created slice User Application Slice.
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Started Daily Cleanup of User's Temporary Directories.
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Reached target Paths.
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Reached target Timers.
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Starting D-Bus User Message Bus Socket...
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Starting Create User's Volatile Files and Directories...
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Finished Create User's Volatile Files and Directories.
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Listening on D-Bus User Message Bus Socket.
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Reached target Sockets.
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Reached target Basic System.
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Reached target Main User Target.
Oct 01 13:01:12 np0005464591.novalocal systemd[1077]: Startup finished in 133ms.
Oct 01 13:01:12 np0005464591.novalocal systemd[1]: Started User Manager for UID 1000.
Oct 01 13:01:12 np0005464591.novalocal systemd[1]: Started Session 1 of User zuul.
Oct 01 13:01:12 np0005464591.novalocal sshd-session[1073]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:01:13 np0005464591.novalocal python3[1159]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:01:16 np0005464591.novalocal python3[1187]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:01:17 np0005464591.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 01 13:01:17 np0005464591.novalocal sshd-session[1188]: Connection closed by authenticating user root 185.156.73.233 port 46106 [preauth]
Oct 01 13:01:25 np0005464591.novalocal python3[1249]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:01:26 np0005464591.novalocal python3[1289]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct 01 13:01:28 np0005464591.novalocal python3[1315]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyVnxu92lxwfUidt9dk9bKce3oXPc+r/v6/KpHn5T2xkPGmOEncfg6YeWwIWxfIRiEUrgoAj8MqvSYB1z9u9kz8DLgpOjExGzO0uGc1m7QsPxlWeNI9t9pCXeYDBeOYcfln7gli4PP1MTPAb4Ka4uANNlZSyq1KQn6LjutMLvSfLXXK1YTBS3HxU977rm3N65JjCx+uQJF2PxhtWk6AWvt6LebPoH8uIZsPtBwwfNJYaKA+ZD8PHJCkN2XLE9LJXcsWcZJ5mboiVwre4iqsMCwCfavGyZ64bJisUHapM36VnwZBwnmrG+otI4WJKbnGfpTSrJyYfSIxcWjbQXhBdJKeq2QXb6CsPaqO1VuAOfFWvFJW80wsfM0wXVd2922DqlNCwyERMqXdWOI7umS6d/NfJMqPVUWhwd8JNfTb14seiwQh+yhIen1cZwx4Rq5Y5SQlVmKKWYmw4XWj7/cghMUM5KDF6XzYTJvdPhruZcZFV97zaZFO7f/db6a1MPkIbU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:28 np0005464591.novalocal python3[1339]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:29 np0005464591.novalocal python3[1438]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:01:29 np0005464591.novalocal python3[1509]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759323688.7032058-229-75277063447788/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=295c928d1f104a949fcb982109af39da_id_rsa follow=False checksum=3fc4a6dcb39ae00dc6e1df95b1abe78d9e047358 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:30 np0005464591.novalocal python3[1632]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:01:30 np0005464591.novalocal python3[1703]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759323689.7584388-273-53487121986162/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=295c928d1f104a949fcb982109af39da_id_rsa.pub follow=False checksum=a0ef47ccbc5b8fd981f3bb7b73390fa046e84374 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:31 np0005464591.novalocal python3[1751]: ansible-ping Invoked with data=pong
Oct 01 13:01:32 np0005464591.novalocal python3[1775]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:01:34 np0005464591.novalocal python3[1833]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct 01 13:01:35 np0005464591.novalocal python3[1865]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:36 np0005464591.novalocal python3[1889]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:36 np0005464591.novalocal python3[1913]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:36 np0005464591.novalocal irqbalance[786]: Cannot change IRQ 26 affinity: Operation not permitted
Oct 01 13:01:36 np0005464591.novalocal irqbalance[786]: IRQ 26 affinity is now unmanaged
Oct 01 13:01:36 np0005464591.novalocal python3[1937]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:37 np0005464591.novalocal python3[1961]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:37 np0005464591.novalocal python3[1985]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:38 np0005464591.novalocal sudo[2009]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rligmyujddaflgznmihrlzzbocfipilz ; /usr/bin/python3'
Oct 01 13:01:38 np0005464591.novalocal sudo[2009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:01:39 np0005464591.novalocal python3[2011]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:39 np0005464591.novalocal sudo[2009]: pam_unix(sudo:session): session closed for user root
Oct 01 13:01:39 np0005464591.novalocal sudo[2087]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnlcbjhbqlmkyucxvczvfdovzejgkqup ; /usr/bin/python3'
Oct 01 13:01:39 np0005464591.novalocal sudo[2087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:01:39 np0005464591.novalocal python3[2089]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:01:39 np0005464591.novalocal sudo[2087]: pam_unix(sudo:session): session closed for user root
Oct 01 13:01:39 np0005464591.novalocal sudo[2160]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyvtbqibpeyqzpuvslnaryqbvrcsikle ; /usr/bin/python3'
Oct 01 13:01:39 np0005464591.novalocal sudo[2160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:01:40 np0005464591.novalocal python3[2162]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759323699.1526487-26-262021872199099/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:40 np0005464591.novalocal sudo[2160]: pam_unix(sudo:session): session closed for user root
Oct 01 13:01:40 np0005464591.novalocal python3[2210]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:41 np0005464591.novalocal python3[2234]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:41 np0005464591.novalocal python3[2258]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:41 np0005464591.novalocal python3[2282]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:42 np0005464591.novalocal python3[2306]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:42 np0005464591.novalocal python3[2330]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:42 np0005464591.novalocal python3[2354]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:42 np0005464591.novalocal python3[2378]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:43 np0005464591.novalocal python3[2402]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:43 np0005464591.novalocal python3[2426]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:43 np0005464591.novalocal python3[2450]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:44 np0005464591.novalocal python3[2474]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:44 np0005464591.novalocal python3[2498]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:44 np0005464591.novalocal python3[2522]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:45 np0005464591.novalocal python3[2546]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:45 np0005464591.novalocal python3[2570]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:45 np0005464591.novalocal python3[2594]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:45 np0005464591.novalocal python3[2618]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:46 np0005464591.novalocal python3[2642]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:46 np0005464591.novalocal python3[2666]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:46 np0005464591.novalocal python3[2690]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:47 np0005464591.novalocal python3[2714]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:47 np0005464591.novalocal python3[2738]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:47 np0005464591.novalocal python3[2762]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:47 np0005464591.novalocal python3[2786]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:48 np0005464591.novalocal python3[2810]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:01:50 np0005464591.novalocal sudo[2834]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxqybilsukrqijtaijfitjstszgvlyrp ; /usr/bin/python3'
Oct 01 13:01:50 np0005464591.novalocal sudo[2834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:01:50 np0005464591.novalocal python3[2836]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 01 13:01:50 np0005464591.novalocal systemd[1]: Starting Time & Date Service...
Oct 01 13:01:50 np0005464591.novalocal systemd[1]: Started Time & Date Service.
Oct 01 13:01:50 np0005464591.novalocal systemd-timedated[2838]: Changed time zone to 'UTC' (UTC).
Oct 01 13:01:50 np0005464591.novalocal sudo[2834]: pam_unix(sudo:session): session closed for user root
Oct 01 13:01:51 np0005464591.novalocal sudo[2865]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aflattncknggqiehsctuoehqrpxnlsdf ; /usr/bin/python3'
Oct 01 13:01:51 np0005464591.novalocal sudo[2865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:01:51 np0005464591.novalocal python3[2867]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:51 np0005464591.novalocal sudo[2865]: pam_unix(sudo:session): session closed for user root
Oct 01 13:01:51 np0005464591.novalocal python3[2943]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:01:52 np0005464591.novalocal python3[3014]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759323711.475183-202-161630586838481/source _original_basename=tmp_wfx0ekq follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:52 np0005464591.novalocal python3[3114]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:01:53 np0005464591.novalocal python3[3185]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759323712.4476075-242-127676419520869/source _original_basename=tmpfj17mwuh follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:53 np0005464591.novalocal sudo[3285]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svbxafluofnbaqxkjsknbmzjpwyocvnv ; /usr/bin/python3'
Oct 01 13:01:53 np0005464591.novalocal sudo[3285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:01:54 np0005464591.novalocal python3[3287]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:01:54 np0005464591.novalocal sudo[3285]: pam_unix(sudo:session): session closed for user root
Oct 01 13:01:54 np0005464591.novalocal sudo[3358]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzkoifiirlxtlakovvrybpcpqqezsbzd ; /usr/bin/python3'
Oct 01 13:01:54 np0005464591.novalocal sudo[3358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:01:54 np0005464591.novalocal python3[3360]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759323713.756376-306-242827019892408/source _original_basename=tmp1v2amj_g follow=False checksum=3890c647ec31611ac0b34a57deacc59e0e355a1f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:54 np0005464591.novalocal sudo[3358]: pam_unix(sudo:session): session closed for user root
Oct 01 13:01:55 np0005464591.novalocal python3[3408]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:01:55 np0005464591.novalocal python3[3434]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:01:55 np0005464591.novalocal sudo[3512]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvfsmtyhnkkcxlolfqvlojkvltrprhes ; /usr/bin/python3'
Oct 01 13:01:55 np0005464591.novalocal sudo[3512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:01:56 np0005464591.novalocal python3[3514]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:01:56 np0005464591.novalocal sudo[3512]: pam_unix(sudo:session): session closed for user root
Oct 01 13:01:56 np0005464591.novalocal sudo[3585]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eioelapwnkfndhtudrnelmibohtpqlbp ; /usr/bin/python3'
Oct 01 13:01:56 np0005464591.novalocal sudo[3585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:01:56 np0005464591.novalocal python3[3587]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759323715.7642655-362-126496390619364/source _original_basename=tmpudydxmp4 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:01:56 np0005464591.novalocal sudo[3585]: pam_unix(sudo:session): session closed for user root
Oct 01 13:01:57 np0005464591.novalocal sudo[3636]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxyraiqcbixtxwnjlddqlakemxmhbblx ; /usr/bin/python3'
Oct 01 13:01:57 np0005464591.novalocal sudo[3636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:01:57 np0005464591.novalocal python3[3638]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-db67-4db8-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:01:57 np0005464591.novalocal sudo[3636]: pam_unix(sudo:session): session closed for user root
Oct 01 13:01:57 np0005464591.novalocal python3[3666]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-db67-4db8-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct 01 13:01:59 np0005464591.novalocal python3[3695]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:02:16 np0005464591.novalocal sudo[3719]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjqgutuboedxmrbjurpbjudfdsocargp ; /usr/bin/python3'
Oct 01 13:02:16 np0005464591.novalocal sudo[3719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:02:16 np0005464591.novalocal python3[3721]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:02:16 np0005464591.novalocal sudo[3719]: pam_unix(sudo:session): session closed for user root
Oct 01 13:02:20 np0005464591.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 01 13:02:55 np0005464591.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct 01 13:02:55 np0005464591.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Oct 01 13:02:55 np0005464591.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Oct 01 13:02:55 np0005464591.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Oct 01 13:02:55 np0005464591.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Oct 01 13:02:55 np0005464591.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Oct 01 13:02:55 np0005464591.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Oct 01 13:02:55 np0005464591.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Oct 01 13:02:55 np0005464591.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Oct 01 13:02:55 np0005464591.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct 01 13:02:56 np0005464591.novalocal NetworkManager[860]: <info>  [1759323776.0131] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 01 13:02:56 np0005464591.novalocal systemd-udevd[3724]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 13:02:56 np0005464591.novalocal NetworkManager[860]: <info>  [1759323776.0436] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 01 13:02:56 np0005464591.novalocal NetworkManager[860]: <info>  [1759323776.0493] settings: (eth1): created default wired connection 'Wired connection 1'
Oct 01 13:02:56 np0005464591.novalocal NetworkManager[860]: <info>  [1759323776.0500] device (eth1): carrier: link connected
Oct 01 13:02:56 np0005464591.novalocal NetworkManager[860]: <info>  [1759323776.0505] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct 01 13:02:56 np0005464591.novalocal NetworkManager[860]: <info>  [1759323776.0516] policy: auto-activating connection 'Wired connection 1' (899ff706-8c33-3033-a1ad-6ae636ae4542)
Oct 01 13:02:56 np0005464591.novalocal NetworkManager[860]: <info>  [1759323776.0523] device (eth1): Activation: starting connection 'Wired connection 1' (899ff706-8c33-3033-a1ad-6ae636ae4542)
Oct 01 13:02:56 np0005464591.novalocal NetworkManager[860]: <info>  [1759323776.0525] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:02:56 np0005464591.novalocal NetworkManager[860]: <info>  [1759323776.0530] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:02:56 np0005464591.novalocal NetworkManager[860]: <info>  [1759323776.0538] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:02:56 np0005464591.novalocal NetworkManager[860]: <info>  [1759323776.0546] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 01 13:02:57 np0005464591.novalocal python3[3751]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-25e3-6ecf-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:03:04 np0005464591.novalocal sudo[3829]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlmokpckzcbrxizdgzvfocvpujfuykdd ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 01 13:03:04 np0005464591.novalocal sudo[3829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:03:04 np0005464591.novalocal python3[3831]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:03:04 np0005464591.novalocal sudo[3829]: pam_unix(sudo:session): session closed for user root
Oct 01 13:03:04 np0005464591.novalocal sudo[3902]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmzfjfdxyzokqsdrhqplqqjiswyzcomx ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 01 13:03:04 np0005464591.novalocal sudo[3902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:03:04 np0005464591.novalocal python3[3904]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759323784.1659582-103-64739518018774/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=406e0cb981d7fbf8419f49bcf2c0553181d3c7ac backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:03:04 np0005464591.novalocal sudo[3902]: pam_unix(sudo:session): session closed for user root
Oct 01 13:03:05 np0005464591.novalocal sudo[3952]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xifgooxudmunrviqxbvrpkwgdazmlsyl ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 01 13:03:05 np0005464591.novalocal sudo[3952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:03:05 np0005464591.novalocal python3[3954]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:03:05 np0005464591.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 01 13:03:05 np0005464591.novalocal systemd[1]: Stopped Network Manager Wait Online.
Oct 01 13:03:05 np0005464591.novalocal systemd[1]: Stopping Network Manager Wait Online...
Oct 01 13:03:05 np0005464591.novalocal systemd[1]: Stopping Network Manager...
Oct 01 13:03:05 np0005464591.novalocal NetworkManager[860]: <info>  [1759323785.9903] caught SIGTERM, shutting down normally.
Oct 01 13:03:05 np0005464591.novalocal NetworkManager[860]: <info>  [1759323785.9916] dhcp4 (eth0): canceled DHCP transaction
Oct 01 13:03:05 np0005464591.novalocal NetworkManager[860]: <info>  [1759323785.9916] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 01 13:03:05 np0005464591.novalocal NetworkManager[860]: <info>  [1759323785.9916] dhcp4 (eth0): state changed no lease
Oct 01 13:03:05 np0005464591.novalocal NetworkManager[860]: <info>  [1759323785.9919] manager: NetworkManager state is now CONNECTING
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[860]: <info>  [1759323786.0023] dhcp4 (eth1): canceled DHCP transaction
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[860]: <info>  [1759323786.0024] dhcp4 (eth1): state changed no lease
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[860]: <info>  [1759323786.0074] exiting (success)
Oct 01 13:03:06 np0005464591.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 01 13:03:06 np0005464591.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 01 13:03:06 np0005464591.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 01 13:03:06 np0005464591.novalocal systemd[1]: Stopped Network Manager.
Oct 01 13:03:06 np0005464591.novalocal systemd[1]: Starting Network Manager...
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.0858] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:1fa3a04c-5158-4a26-9fa0-b8b34ae08d38)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.0860] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.0926] manager[0x55d7f17fe070]: monitoring kernel firmware directory '/lib/firmware'.
Oct 01 13:03:06 np0005464591.novalocal systemd[1]: Starting Hostname Service...
Oct 01 13:03:06 np0005464591.novalocal systemd[1]: Started Hostname Service.
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2132] hostname: hostname: using hostnamed
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2133] hostname: static hostname changed from (none) to "np0005464591.novalocal"
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2138] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2144] manager[0x55d7f17fe070]: rfkill: Wi-Fi hardware radio set enabled
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2144] manager[0x55d7f17fe070]: rfkill: WWAN hardware radio set enabled
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2176] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2177] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2177] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2178] manager: Networking is enabled by state file
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2180] settings: Loaded settings plugin: keyfile (internal)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2186] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2212] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2222] dhcp: init: Using DHCP client 'internal'
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2225] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2230] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2236] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2245] device (lo): Activation: starting connection 'lo' (d1361516-740f-4fdb-ad0c-6174cd593c78)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2253] device (eth0): carrier: link connected
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2258] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2262] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2263] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2269] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2276] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2282] device (eth1): carrier: link connected
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2287] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2292] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (899ff706-8c33-3033-a1ad-6ae636ae4542) (indicated)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2292] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2298] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2305] device (eth1): Activation: starting connection 'Wired connection 1' (899ff706-8c33-3033-a1ad-6ae636ae4542)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2311] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 01 13:03:06 np0005464591.novalocal systemd[1]: Started Network Manager.
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2315] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2318] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2320] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2323] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2325] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2327] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2330] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2332] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2338] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2340] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2349] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2351] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2373] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2375] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 01 13:03:06 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323786.2381] device (lo): Activation: successful, device activated.
Oct 01 13:03:06 np0005464591.novalocal systemd[1]: Starting Network Manager Wait Online...
Oct 01 13:03:06 np0005464591.novalocal sudo[3952]: pam_unix(sudo:session): session closed for user root
Oct 01 13:03:06 np0005464591.novalocal python3[4019]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-25e3-6ecf-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:03:07 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323787.5636] dhcp4 (eth0): state changed new lease, address=38.102.83.163
Oct 01 13:03:07 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323787.5651] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 01 13:03:07 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323787.5747] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 01 13:03:07 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323787.5792] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 01 13:03:07 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323787.5794] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 01 13:03:07 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323787.5797] manager: NetworkManager state is now CONNECTED_SITE
Oct 01 13:03:07 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323787.5800] device (eth0): Activation: successful, device activated.
Oct 01 13:03:07 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323787.5806] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 01 13:03:17 np0005464591.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 01 13:03:36 np0005464591.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4044] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 01 13:03:51 np0005464591.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 01 13:03:51 np0005464591.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4327] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4331] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4348] device (eth1): Activation: successful, device activated.
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4356] manager: startup complete
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4359] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <warn>  [1759323831.4372] device (eth1): Activation: failed for connection 'Wired connection 1'
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4379] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Oct 01 13:03:51 np0005464591.novalocal systemd[1]: Finished Network Manager Wait Online.
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4564] dhcp4 (eth1): canceled DHCP transaction
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4565] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4565] dhcp4 (eth1): state changed no lease
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4588] policy: auto-activating connection 'ci-private-network' (601254e3-3abf-5e1b-b4d3-7e1a095eff98)
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4594] device (eth1): Activation: starting connection 'ci-private-network' (601254e3-3abf-5e1b-b4d3-7e1a095eff98)
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4595] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4597] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4606] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4614] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4650] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4651] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 01 13:03:51 np0005464591.novalocal NetworkManager[3964]: <info>  [1759323831.4659] device (eth1): Activation: successful, device activated.
Oct 01 13:04:01 np0005464591.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 01 13:04:06 np0005464591.novalocal systemd[1077]: Starting Mark boot as successful...
Oct 01 13:04:06 np0005464591.novalocal systemd[1077]: Finished Mark boot as successful.
Oct 01 13:04:06 np0005464591.novalocal sshd-session[1086]: Received disconnect from 38.102.83.114 port 34908:11: disconnected by user
Oct 01 13:04:06 np0005464591.novalocal sshd-session[1086]: Disconnected from user zuul 38.102.83.114 port 34908
Oct 01 13:04:06 np0005464591.novalocal sshd-session[1073]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:04:06 np0005464591.novalocal systemd-logind[791]: Session 1 logged out. Waiting for processes to exit.
Oct 01 13:04:32 np0005464591.novalocal sshd-session[4067]: Accepted publickey for zuul from 38.102.83.114 port 38098 ssh2: RSA SHA256:mTOoaYeQTbHi+nGlTU5DaEE2rWDOISeRciaPGBkGOUE
Oct 01 13:04:32 np0005464591.novalocal systemd-logind[791]: New session 3 of user zuul.
Oct 01 13:04:32 np0005464591.novalocal systemd[1]: Started Session 3 of User zuul.
Oct 01 13:04:32 np0005464591.novalocal sshd-session[4067]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:04:32 np0005464591.novalocal sudo[4146]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mysijhuncmwrhtltorvnmprxucxflvsr ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 01 13:04:32 np0005464591.novalocal sudo[4146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:04:32 np0005464591.novalocal python3[4148]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:04:32 np0005464591.novalocal sudo[4146]: pam_unix(sudo:session): session closed for user root
Oct 01 13:04:32 np0005464591.novalocal sudo[4219]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icjjfyvdfqzyuxmaynrjavbzuezuleqp ; OS_CLOUD=vexxhost /usr/bin/python3'
Oct 01 13:04:32 np0005464591.novalocal sudo[4219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:04:33 np0005464591.novalocal python3[4221]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759323872.3874495-312-119000412233763/source _original_basename=tmp6w_t6jm3 follow=False checksum=74bbc88d5006486575c534acedb83a390ebf92b4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:04:33 np0005464591.novalocal sudo[4219]: pam_unix(sudo:session): session closed for user root
Oct 01 13:04:36 np0005464591.novalocal sshd-session[4070]: Connection closed by 38.102.83.114 port 38098
Oct 01 13:04:36 np0005464591.novalocal sshd-session[4067]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:04:36 np0005464591.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Oct 01 13:04:36 np0005464591.novalocal systemd-logind[791]: Session 3 logged out. Waiting for processes to exit.
Oct 01 13:04:36 np0005464591.novalocal systemd-logind[791]: Removed session 3.
Oct 01 13:05:08 np0005464591.novalocal sshd-session[4246]: Received disconnect from 193.46.255.99 port 17994:11:  [preauth]
Oct 01 13:05:08 np0005464591.novalocal sshd-session[4246]: Disconnected from authenticating user root 193.46.255.99 port 17994 [preauth]
Oct 01 13:07:06 np0005464591.novalocal systemd[1077]: Created slice User Background Tasks Slice.
Oct 01 13:07:06 np0005464591.novalocal systemd[1077]: Starting Cleanup of User's Temporary Files and Directories...
Oct 01 13:07:06 np0005464591.novalocal systemd[1077]: Finished Cleanup of User's Temporary Files and Directories.
Oct 01 13:10:25 np0005464591.novalocal sshd-session[4252]: Accepted publickey for zuul from 38.102.83.114 port 54728 ssh2: RSA SHA256:mTOoaYeQTbHi+nGlTU5DaEE2rWDOISeRciaPGBkGOUE
Oct 01 13:10:25 np0005464591.novalocal systemd-logind[791]: New session 4 of user zuul.
Oct 01 13:10:25 np0005464591.novalocal systemd[1]: Started Session 4 of User zuul.
Oct 01 13:10:25 np0005464591.novalocal sshd-session[4252]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:10:25 np0005464591.novalocal sudo[4279]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdjlyabdsowywqoqrepglrfkbveigrfj ; /usr/bin/python3'
Oct 01 13:10:25 np0005464591.novalocal sudo[4279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:10:26 np0005464591.novalocal python3[4281]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-7409-d469-000000001cf5-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:10:26 np0005464591.novalocal sudo[4279]: pam_unix(sudo:session): session closed for user root
Oct 01 13:10:26 np0005464591.novalocal sudo[4308]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcvqnzibdluoxmkxrplkfwhbvyfatlim ; /usr/bin/python3'
Oct 01 13:10:26 np0005464591.novalocal sudo[4308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:10:26 np0005464591.novalocal python3[4310]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:10:26 np0005464591.novalocal sudo[4308]: pam_unix(sudo:session): session closed for user root
Oct 01 13:10:26 np0005464591.novalocal sudo[4334]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwkkxvlmfzacxzvxpscomfcuoicvxgfu ; /usr/bin/python3'
Oct 01 13:10:26 np0005464591.novalocal sudo[4334]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:10:27 np0005464591.novalocal python3[4336]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:10:27 np0005464591.novalocal sudo[4334]: pam_unix(sudo:session): session closed for user root
Oct 01 13:10:27 np0005464591.novalocal sudo[4360]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhtwntkptkxntwhuymwnqicsbaomtyyo ; /usr/bin/python3'
Oct 01 13:10:27 np0005464591.novalocal sudo[4360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:10:27 np0005464591.novalocal python3[4362]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:10:27 np0005464591.novalocal sudo[4360]: pam_unix(sudo:session): session closed for user root
Oct 01 13:10:27 np0005464591.novalocal sudo[4386]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjsdnfswbyufumjrkwjtamzaqhvsmrsy ; /usr/bin/python3'
Oct 01 13:10:27 np0005464591.novalocal sudo[4386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:10:27 np0005464591.novalocal python3[4388]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:10:27 np0005464591.novalocal sudo[4386]: pam_unix(sudo:session): session closed for user root
Oct 01 13:10:28 np0005464591.novalocal sudo[4412]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yufjivdgurwxgjchnnhafzvgcuqicqhh ; /usr/bin/python3'
Oct 01 13:10:28 np0005464591.novalocal sudo[4412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:10:28 np0005464591.novalocal python3[4414]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:10:28 np0005464591.novalocal python3[4414]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct 01 13:10:28 np0005464591.novalocal sudo[4412]: pam_unix(sudo:session): session closed for user root
Oct 01 13:10:28 np0005464591.novalocal sudo[4438]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuthwluvlyhpepotgutlfrezyioofdkp ; /usr/bin/python3'
Oct 01 13:10:28 np0005464591.novalocal sudo[4438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:10:29 np0005464591.novalocal python3[4440]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:10:29 np0005464591.novalocal systemd[1]: Reloading.
Oct 01 13:10:29 np0005464591.novalocal systemd-rc-local-generator[4461]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:10:29 np0005464591.novalocal sudo[4438]: pam_unix(sudo:session): session closed for user root
Oct 01 13:10:30 np0005464591.novalocal sudo[4494]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekbbajwuunlnnbpaeccxcodfscnpgxec ; /usr/bin/python3'
Oct 01 13:10:30 np0005464591.novalocal sudo[4494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:10:30 np0005464591.novalocal python3[4496]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct 01 13:10:30 np0005464591.novalocal sudo[4494]: pam_unix(sudo:session): session closed for user root
Oct 01 13:10:31 np0005464591.novalocal sudo[4520]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixngdawrsvjorotltwvlmntujnkmfexh ; /usr/bin/python3'
Oct 01 13:10:31 np0005464591.novalocal sudo[4520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:10:31 np0005464591.novalocal python3[4522]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:10:31 np0005464591.novalocal sudo[4520]: pam_unix(sudo:session): session closed for user root
Oct 01 13:10:31 np0005464591.novalocal sudo[4548]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnacqeyangeawjdccyvkbhosgjvnxvsk ; /usr/bin/python3'
Oct 01 13:10:31 np0005464591.novalocal sudo[4548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:10:31 np0005464591.novalocal python3[4550]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:10:31 np0005464591.novalocal sudo[4548]: pam_unix(sudo:session): session closed for user root
Oct 01 13:10:31 np0005464591.novalocal sudo[4576]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beinrhjlxdynsgfsxzvmnvvcilznhihy ; /usr/bin/python3'
Oct 01 13:10:31 np0005464591.novalocal sudo[4576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:10:31 np0005464591.novalocal python3[4578]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:10:31 np0005464591.novalocal sudo[4576]: pam_unix(sudo:session): session closed for user root
Oct 01 13:10:31 np0005464591.novalocal sudo[4604]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryvjwyuttlofdugmfetmyrkptnbdcqzn ; /usr/bin/python3'
Oct 01 13:10:31 np0005464591.novalocal sudo[4604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:10:31 np0005464591.novalocal python3[4606]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:10:31 np0005464591.novalocal sudo[4604]: pam_unix(sudo:session): session closed for user root
Oct 01 13:10:32 np0005464591.novalocal python3[4633]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-7409-d469-000000001cfb-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:10:33 np0005464591.novalocal python3[4663]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:10:34 np0005464591.novalocal sshd-session[4255]: Connection closed by 38.102.83.114 port 54728
Oct 01 13:10:34 np0005464591.novalocal sshd-session[4252]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:10:34 np0005464591.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Oct 01 13:10:34 np0005464591.novalocal systemd[1]: session-4.scope: Consumed 3.389s CPU time.
Oct 01 13:10:34 np0005464591.novalocal systemd-logind[791]: Session 4 logged out. Waiting for processes to exit.
Oct 01 13:10:34 np0005464591.novalocal systemd-logind[791]: Removed session 4.
Oct 01 13:10:36 np0005464591.novalocal irqbalance[786]: Cannot change IRQ 27 affinity: Operation not permitted
Oct 01 13:10:36 np0005464591.novalocal irqbalance[786]: IRQ 27 affinity is now unmanaged
Oct 01 13:10:36 np0005464591.novalocal sshd-session[4668]: Accepted publickey for zuul from 38.102.83.114 port 47988 ssh2: RSA SHA256:mTOoaYeQTbHi+nGlTU5DaEE2rWDOISeRciaPGBkGOUE
Oct 01 13:10:36 np0005464591.novalocal systemd-logind[791]: New session 5 of user zuul.
Oct 01 13:10:36 np0005464591.novalocal systemd[1]: Started Session 5 of User zuul.
Oct 01 13:10:36 np0005464591.novalocal sshd-session[4668]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:10:36 np0005464591.novalocal sudo[4695]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njmtdtwcdkrsqjqcytyfgwubdwqouxmb ; /usr/bin/python3'
Oct 01 13:10:36 np0005464591.novalocal sudo[4695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:10:37 np0005464591.novalocal python3[4697]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct 01 13:10:56 np0005464591.novalocal sshd-session[4720]: Connection closed by authenticating user root 80.94.95.116 port 32136 [preauth]
Oct 01 13:11:11 np0005464591.novalocal kernel: SELinux:  Converting 364 SID table entries...
Oct 01 13:11:11 np0005464591.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 01 13:11:11 np0005464591.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 01 13:11:11 np0005464591.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 01 13:11:11 np0005464591.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 01 13:11:11 np0005464591.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 01 13:11:11 np0005464591.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 01 13:11:11 np0005464591.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 01 13:11:21 np0005464591.novalocal kernel: SELinux:  Converting 364 SID table entries...
Oct 01 13:11:21 np0005464591.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 01 13:11:21 np0005464591.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 01 13:11:21 np0005464591.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 01 13:11:21 np0005464591.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 01 13:11:21 np0005464591.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 01 13:11:21 np0005464591.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 01 13:11:21 np0005464591.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 01 13:11:30 np0005464591.novalocal kernel: SELinux:  Converting 364 SID table entries...
Oct 01 13:11:30 np0005464591.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 01 13:11:30 np0005464591.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 01 13:11:30 np0005464591.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 01 13:11:30 np0005464591.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 01 13:11:30 np0005464591.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 01 13:11:30 np0005464591.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 01 13:11:30 np0005464591.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 01 13:11:32 np0005464591.novalocal setsebool[4782]: The virt_use_nfs policy boolean was changed to 1 by root
Oct 01 13:11:32 np0005464591.novalocal setsebool[4782]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct 01 13:11:45 np0005464591.novalocal kernel: SELinux:  Converting 367 SID table entries...
Oct 01 13:11:45 np0005464591.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Oct 01 13:11:45 np0005464591.novalocal kernel: SELinux:  policy capability open_perms=1
Oct 01 13:11:45 np0005464591.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Oct 01 13:11:45 np0005464591.novalocal kernel: SELinux:  policy capability always_check_network=0
Oct 01 13:11:45 np0005464591.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 01 13:11:45 np0005464591.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 01 13:11:45 np0005464591.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 01 13:11:48 np0005464591.novalocal sshd-session[4805]: Received disconnect from 91.224.92.108 port 22602:11:  [preauth]
Oct 01 13:11:48 np0005464591.novalocal sshd-session[4805]: Disconnected from authenticating user root 91.224.92.108 port 22602 [preauth]
Oct 01 13:12:07 np0005464591.novalocal dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 01 13:12:07 np0005464591.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 01 13:12:07 np0005464591.novalocal systemd[1]: Starting man-db-cache-update.service...
Oct 01 13:12:07 np0005464591.novalocal systemd[1]: Reloading.
Oct 01 13:12:07 np0005464591.novalocal systemd-rc-local-generator[5537]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:12:08 np0005464591.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Oct 01 13:12:11 np0005464591.novalocal systemd[1]: Starting PackageKit Daemon...
Oct 01 13:12:11 np0005464591.novalocal PackageKit[6912]: daemon start
Oct 01 13:12:11 np0005464591.novalocal systemd[1]: Starting Authorization Manager...
Oct 01 13:12:11 np0005464591.novalocal polkitd[6977]: Started polkitd version 0.117
Oct 01 13:12:11 np0005464591.novalocal polkitd[6977]: Loading rules from directory /etc/polkit-1/rules.d
Oct 01 13:12:11 np0005464591.novalocal polkitd[6977]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 01 13:12:11 np0005464591.novalocal polkitd[6977]: Finished loading, compiling and executing 3 rules
Oct 01 13:12:11 np0005464591.novalocal polkitd[6977]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Oct 01 13:12:11 np0005464591.novalocal systemd[1]: Started Authorization Manager.
Oct 01 13:12:11 np0005464591.novalocal systemd[1]: Started PackageKit Daemon.
Oct 01 13:12:13 np0005464591.novalocal sudo[4695]: pam_unix(sudo:session): session closed for user root
Oct 01 13:12:16 np0005464591.novalocal python3[9262]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-0bfe-4281-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:12:16 np0005464591.novalocal kernel: evm: overlay not supported
Oct 01 13:12:16 np0005464591.novalocal systemd[1077]: Starting D-Bus User Message Bus...
Oct 01 13:12:16 np0005464591.novalocal dbus-broker-launch[9922]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct 01 13:12:16 np0005464591.novalocal dbus-broker-launch[9922]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct 01 13:12:16 np0005464591.novalocal systemd[1077]: Started D-Bus User Message Bus.
Oct 01 13:12:16 np0005464591.novalocal dbus-broker-lau[9922]: Ready
Oct 01 13:12:16 np0005464591.novalocal systemd[1077]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct 01 13:12:16 np0005464591.novalocal systemd[1077]: Created slice Slice /user.
Oct 01 13:12:16 np0005464591.novalocal systemd[1077]: podman-9801.scope: unit configures an IP firewall, but not running as root.
Oct 01 13:12:16 np0005464591.novalocal systemd[1077]: (This warning is only shown for the first unit using IP firewalling.)
Oct 01 13:12:16 np0005464591.novalocal systemd[1077]: Started podman-9801.scope.
Oct 01 13:12:17 np0005464591.novalocal systemd[1077]: Started podman-pause-90292730.scope.
Oct 01 13:12:17 np0005464591.novalocal sudo[10566]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynhmtyoicqphzggwkjpahltslvmlegnu ; /usr/bin/python3'
Oct 01 13:12:17 np0005464591.novalocal sudo[10566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:12:18 np0005464591.novalocal python3[10568]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.30:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.30:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:12:18 np0005464591.novalocal sudo[10566]: pam_unix(sudo:session): session closed for user root
Oct 01 13:12:18 np0005464591.novalocal sshd-session[4671]: Connection closed by 38.102.83.114 port 47988
Oct 01 13:12:18 np0005464591.novalocal sshd-session[4668]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:12:18 np0005464591.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Oct 01 13:12:18 np0005464591.novalocal systemd[1]: session-5.scope: Consumed 1min 12.952s CPU time.
Oct 01 13:12:18 np0005464591.novalocal systemd-logind[791]: Session 5 logged out. Waiting for processes to exit.
Oct 01 13:12:18 np0005464591.novalocal systemd-logind[791]: Removed session 5.
Oct 01 13:12:39 np0005464591.novalocal sshd-session[16323]: Connection closed by 38.102.83.243 port 50666 [preauth]
Oct 01 13:12:39 np0005464591.novalocal sshd-session[16327]: Unable to negotiate with 38.102.83.243 port 50678: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Oct 01 13:12:39 np0005464591.novalocal sshd-session[16328]: Connection closed by 38.102.83.243 port 50650 [preauth]
Oct 01 13:12:39 np0005464591.novalocal sshd-session[16330]: Unable to negotiate with 38.102.83.243 port 50680: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Oct 01 13:12:39 np0005464591.novalocal sshd-session[16332]: Unable to negotiate with 38.102.83.243 port 50686: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Oct 01 13:12:44 np0005464591.novalocal sshd-session[17616]: Accepted publickey for zuul from 38.102.83.114 port 53878 ssh2: RSA SHA256:mTOoaYeQTbHi+nGlTU5DaEE2rWDOISeRciaPGBkGOUE
Oct 01 13:12:44 np0005464591.novalocal systemd-logind[791]: New session 6 of user zuul.
Oct 01 13:12:44 np0005464591.novalocal systemd[1]: Started Session 6 of User zuul.
Oct 01 13:12:44 np0005464591.novalocal sshd-session[17616]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:12:44 np0005464591.novalocal python3[17691]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHXrWYyf3oKUiHHLYLUt7lNyfnT+aZy+k9HbrtX7uQjvW1JHdUFb5ik1lguWaAi3xvSWkTt6pb63u0zVccqskhE= zuul@np0005464590.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:12:44 np0005464591.novalocal sudo[17819]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfadgztzocvmrixrrilvnnzkobmeaxjv ; /usr/bin/python3'
Oct 01 13:12:44 np0005464591.novalocal sudo[17819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:12:44 np0005464591.novalocal python3[17828]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHXrWYyf3oKUiHHLYLUt7lNyfnT+aZy+k9HbrtX7uQjvW1JHdUFb5ik1lguWaAi3xvSWkTt6pb63u0zVccqskhE= zuul@np0005464590.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:12:45 np0005464591.novalocal sudo[17819]: pam_unix(sudo:session): session closed for user root
Oct 01 13:12:45 np0005464591.novalocal sudo[18121]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sglapeyudrrjrdqomabethwhyiubqnbg ; /usr/bin/python3'
Oct 01 13:12:45 np0005464591.novalocal sudo[18121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:12:45 np0005464591.novalocal python3[18131]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005464591.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct 01 13:12:45 np0005464591.novalocal useradd[18177]: new group: name=cloud-admin, GID=1002
Oct 01 13:12:45 np0005464591.novalocal useradd[18177]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Oct 01 13:12:46 np0005464591.novalocal sudo[18121]: pam_unix(sudo:session): session closed for user root
Oct 01 13:12:46 np0005464591.novalocal sudo[18311]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpsicqtqefmjiazlfwwyekqlrofbaxzj ; /usr/bin/python3'
Oct 01 13:12:46 np0005464591.novalocal sudo[18311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:12:46 np0005464591.novalocal python3[18317]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHXrWYyf3oKUiHHLYLUt7lNyfnT+aZy+k9HbrtX7uQjvW1JHdUFb5ik1lguWaAi3xvSWkTt6pb63u0zVccqskhE= zuul@np0005464590.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct 01 13:12:46 np0005464591.novalocal sudo[18311]: pam_unix(sudo:session): session closed for user root
Oct 01 13:12:46 np0005464591.novalocal sudo[18511]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myndrcbbfymmtnhulucsubcuiciibbjx ; /usr/bin/python3'
Oct 01 13:12:46 np0005464591.novalocal sudo[18511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:12:46 np0005464591.novalocal python3[18517]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:12:46 np0005464591.novalocal sudo[18511]: pam_unix(sudo:session): session closed for user root
Oct 01 13:12:47 np0005464591.novalocal sudo[18733]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xokpdysqisamllukuwahwnwfydjledzj ; /usr/bin/python3'
Oct 01 13:12:47 np0005464591.novalocal sudo[18733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:12:47 np0005464591.novalocal python3[18739]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759324366.7042136-151-276897028359659/source _original_basename=tmpgupohui9 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:12:47 np0005464591.novalocal sudo[18733]: pam_unix(sudo:session): session closed for user root
Oct 01 13:12:48 np0005464591.novalocal sudo[18977]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xghthrxbksrcscrkdovvyglbdznmamso ; /usr/bin/python3'
Oct 01 13:12:48 np0005464591.novalocal sudo[18977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:12:48 np0005464591.novalocal python3[18984]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Oct 01 13:12:48 np0005464591.novalocal systemd[1]: Starting Hostname Service...
Oct 01 13:12:48 np0005464591.novalocal systemd[1]: Started Hostname Service.
Oct 01 13:12:48 np0005464591.novalocal systemd-hostnamed[19060]: Changed pretty hostname to 'compute-0'
Oct 01 13:12:48 compute-0 systemd-hostnamed[19060]: Hostname set to <compute-0> (static)
Oct 01 13:12:48 compute-0 NetworkManager[3964]: <info>  [1759324368.6220] hostname: static hostname changed from "np0005464591.novalocal" to "compute-0"
Oct 01 13:12:48 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 01 13:12:48 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 01 13:12:48 compute-0 sudo[18977]: pam_unix(sudo:session): session closed for user root
Oct 01 13:12:48 compute-0 sshd-session[17647]: Connection closed by 38.102.83.114 port 53878
Oct 01 13:12:48 compute-0 sshd-session[17616]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:12:48 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Oct 01 13:12:48 compute-0 systemd[1]: session-6.scope: Consumed 2.616s CPU time.
Oct 01 13:12:48 compute-0 systemd-logind[791]: Session 6 logged out. Waiting for processes to exit.
Oct 01 13:12:48 compute-0 systemd-logind[791]: Removed session 6.
Oct 01 13:12:58 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 01 13:13:13 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 01 13:13:13 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 01 13:13:13 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1min 12.018s CPU time.
Oct 01 13:13:13 compute-0 systemd[1]: run-rfedd29560a044d30bbdd7cc53fb57430.service: Deactivated successfully.
Oct 01 13:13:18 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 01 13:16:06 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Oct 01 13:16:06 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct 01 13:16:06 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Oct 01 13:16:06 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct 01 13:17:03 compute-0 sshd-session[26578]: Accepted publickey for zuul from 38.102.83.243 port 38408 ssh2: RSA SHA256:mTOoaYeQTbHi+nGlTU5DaEE2rWDOISeRciaPGBkGOUE
Oct 01 13:17:03 compute-0 systemd-logind[791]: New session 7 of user zuul.
Oct 01 13:17:03 compute-0 systemd[1]: Started Session 7 of User zuul.
Oct 01 13:17:03 compute-0 sshd-session[26578]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:17:04 compute-0 python3[26654]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:17:05 compute-0 sudo[26768]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntbupaqjgvpqbxxbegcabnbbdvemiwhv ; /usr/bin/python3'
Oct 01 13:17:05 compute-0 sudo[26768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:05 compute-0 python3[26770]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:17:05 compute-0 sudo[26768]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:06 compute-0 sudo[26841]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erajmowfpocfqeyqlleetgdlrggbrvoj ; /usr/bin/python3'
Oct 01 13:17:06 compute-0 sudo[26841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:06 compute-0 python3[26843]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759324625.639242-30453-110132069280967/source mode=0755 _original_basename=delorean.repo follow=False checksum=b5dfbbfb98e78ea01644984f7921e3b97b672f66 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:17:06 compute-0 sudo[26841]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:06 compute-0 sudo[26867]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmdzlheiyjrhzuarmdbjpwbntkzdqnpz ; /usr/bin/python3'
Oct 01 13:17:06 compute-0 sudo[26867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:06 compute-0 python3[26869]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:17:06 compute-0 sudo[26867]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:06 compute-0 sudo[26940]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdxkdtdwtylcoiftzwkemydhdczwwpqh ; /usr/bin/python3'
Oct 01 13:17:06 compute-0 sudo[26940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:07 compute-0 python3[26942]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759324625.639242-30453-110132069280967/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=c22157e85d05af7ffbafa054f80958446d397a41 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:17:07 compute-0 sudo[26940]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:07 compute-0 sudo[26966]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owiqmfqqddrtohlnsgvsmdmrttparnhb ; /usr/bin/python3'
Oct 01 13:17:07 compute-0 sudo[26966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:07 compute-0 python3[26968]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:17:07 compute-0 sudo[26966]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:07 compute-0 sudo[27039]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfbhlhgvdkwlteahdjoboqcdcwmroffh ; /usr/bin/python3'
Oct 01 13:17:07 compute-0 sudo[27039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:07 compute-0 python3[27041]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759324625.639242-30453-110132069280967/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:17:07 compute-0 sudo[27039]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:07 compute-0 sudo[27065]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pahmrxevqsntjctfqlzzaevmqmpxfhsu ; /usr/bin/python3'
Oct 01 13:17:07 compute-0 sudo[27065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:07 compute-0 python3[27067]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:17:07 compute-0 sudo[27065]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:08 compute-0 sudo[27138]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odpqqszhqafquyxktwjydiowklxymwtr ; /usr/bin/python3'
Oct 01 13:17:08 compute-0 sudo[27138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:08 compute-0 python3[27140]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759324625.639242-30453-110132069280967/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:17:08 compute-0 sudo[27138]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:08 compute-0 sudo[27164]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvvytmgrcuyjkinlaqcfjovzjssxsgjf ; /usr/bin/python3'
Oct 01 13:17:08 compute-0 sudo[27164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:08 compute-0 python3[27166]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:17:08 compute-0 sudo[27164]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:08 compute-0 sudo[27237]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pphfsakxngbfhiyanqtjachatpztpfdb ; /usr/bin/python3'
Oct 01 13:17:08 compute-0 sudo[27237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:08 compute-0 python3[27239]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759324625.639242-30453-110132069280967/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:17:08 compute-0 sudo[27237]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:09 compute-0 sudo[27263]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axhneaplpvlgmyibodseopvwwemwzpmo ; /usr/bin/python3'
Oct 01 13:17:09 compute-0 sudo[27263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:09 compute-0 python3[27265]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:17:09 compute-0 sudo[27263]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:09 compute-0 sudo[27336]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyunvwfakdpmhpkpjsfwxsycuyestyvy ; /usr/bin/python3'
Oct 01 13:17:09 compute-0 sudo[27336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:09 compute-0 python3[27338]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759324625.639242-30453-110132069280967/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:17:09 compute-0 sudo[27336]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:09 compute-0 sudo[27362]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdcmwwfubrgdjmhcpsuwappsyxgtgato ; /usr/bin/python3'
Oct 01 13:17:09 compute-0 sudo[27362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:09 compute-0 python3[27364]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:17:09 compute-0 sudo[27362]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:10 compute-0 sudo[27435]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gihqbghalfbksuharqsgvtewitqrfdoe ; /usr/bin/python3'
Oct 01 13:17:10 compute-0 sudo[27435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:10 compute-0 python3[27437]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759324625.639242-30453-110132069280967/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=47e4cafd9254eb43513acdd3d3daa33c50d726c1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:17:10 compute-0 sudo[27435]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:10 compute-0 sudo[27461]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbfsnjtefteyfiixympsvhhzcxzlcjsr ; /usr/bin/python3'
Oct 01 13:17:10 compute-0 sudo[27461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:10 compute-0 python3[27463]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/gating.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct 01 13:17:10 compute-0 sudo[27461]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:10 compute-0 sudo[27534]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tublfackedimjjfbruragyniqqsrxyoz ; /usr/bin/python3'
Oct 01 13:17:10 compute-0 sudo[27534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:17:11 compute-0 python3[27536]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759324625.639242-30453-110132069280967/source mode=0755 _original_basename=gating.repo follow=False checksum=c5b62a3bac5198fa0e7af5b3084c82dfdf674f98 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:17:11 compute-0 sudo[27534]: pam_unix(sudo:session): session closed for user root
Oct 01 13:17:14 compute-0 sshd-session[27561]: Connection closed by 192.168.122.11 port 42902 [preauth]
Oct 01 13:17:14 compute-0 sshd-session[27566]: Unable to negotiate with 192.168.122.11 port 42936: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Oct 01 13:17:14 compute-0 sshd-session[27562]: Connection closed by 192.168.122.11 port 42908 [preauth]
Oct 01 13:17:14 compute-0 sshd-session[27563]: Unable to negotiate with 192.168.122.11 port 42912: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Oct 01 13:17:14 compute-0 sshd-session[27564]: Unable to negotiate with 192.168.122.11 port 42924: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Oct 01 13:17:17 compute-0 PackageKit[6912]: daemon quit
Oct 01 13:17:17 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct 01 13:18:23 compute-0 python3[27594]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:18:49 compute-0 sshd-session[27596]: Received disconnect from 193.46.255.7 port 40432:11:  [preauth]
Oct 01 13:18:49 compute-0 sshd-session[27596]: Disconnected from authenticating user root 193.46.255.7 port 40432 [preauth]
Oct 01 13:22:58 compute-0 sshd-session[27602]: banner exchange: Connection from 64.62.197.107 port 41784: invalid format
Oct 01 13:22:59 compute-0 sshd-session[27600]: Invalid user onlime_r from 185.156.73.233 port 63514
Oct 01 13:22:59 compute-0 sshd-session[27600]: Connection closed by invalid user onlime_r 185.156.73.233 port 63514 [preauth]
Oct 01 13:23:22 compute-0 sshd-session[26581]: Received disconnect from 38.102.83.243 port 38408:11: disconnected by user
Oct 01 13:23:22 compute-0 sshd-session[26581]: Disconnected from user zuul 38.102.83.243 port 38408
Oct 01 13:23:22 compute-0 sshd-session[26578]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:23:22 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Oct 01 13:23:22 compute-0 systemd[1]: session-7.scope: Consumed 5.877s CPU time.
Oct 01 13:23:22 compute-0 systemd-logind[791]: Session 7 logged out. Waiting for processes to exit.
Oct 01 13:23:22 compute-0 systemd-logind[791]: Removed session 7.
Oct 01 13:25:57 compute-0 sshd-session[27605]: Received disconnect from 80.94.93.233 port 37790:11:  [preauth]
Oct 01 13:25:57 compute-0 sshd-session[27605]: Disconnected from authenticating user root 80.94.93.233 port 37790 [preauth]
Oct 01 13:30:23 compute-0 sshd-session[27609]: Accepted publickey for zuul from 192.168.122.30 port 37554 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:30:23 compute-0 systemd-logind[791]: New session 8 of user zuul.
Oct 01 13:30:23 compute-0 systemd[1]: Started Session 8 of User zuul.
Oct 01 13:30:23 compute-0 sshd-session[27609]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:30:25 compute-0 python3.9[27762]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:30:26 compute-0 sudo[27941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezjmrzrygjehvkbzqslebxwwydvdmhwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325425.8724818-44-188817529950440/AnsiballZ_command.py'
Oct 01 13:30:26 compute-0 sudo[27941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:30:26 compute-0 python3.9[27943]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:30:33 compute-0 sudo[27941]: pam_unix(sudo:session): session closed for user root
Oct 01 13:30:34 compute-0 sshd-session[27612]: Connection closed by 192.168.122.30 port 37554
Oct 01 13:30:34 compute-0 sshd-session[27609]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:30:34 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Oct 01 13:30:34 compute-0 systemd[1]: session-8.scope: Consumed 8.014s CPU time.
Oct 01 13:30:34 compute-0 systemd-logind[791]: Session 8 logged out. Waiting for processes to exit.
Oct 01 13:30:34 compute-0 systemd-logind[791]: Removed session 8.
Oct 01 13:30:39 compute-0 sshd-session[28001]: Accepted publickey for zuul from 192.168.122.30 port 60868 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:30:39 compute-0 systemd-logind[791]: New session 9 of user zuul.
Oct 01 13:30:39 compute-0 systemd[1]: Started Session 9 of User zuul.
Oct 01 13:30:39 compute-0 sshd-session[28001]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:30:41 compute-0 python3.9[28154]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:30:41 compute-0 sshd-session[28004]: Connection closed by 192.168.122.30 port 60868
Oct 01 13:30:41 compute-0 sshd-session[28001]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:30:41 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Oct 01 13:30:41 compute-0 systemd-logind[791]: Session 9 logged out. Waiting for processes to exit.
Oct 01 13:30:41 compute-0 systemd-logind[791]: Removed session 9.
Oct 01 13:30:57 compute-0 sshd-session[28182]: Accepted publickey for zuul from 192.168.122.30 port 34100 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:30:57 compute-0 systemd-logind[791]: New session 10 of user zuul.
Oct 01 13:30:57 compute-0 systemd[1]: Started Session 10 of User zuul.
Oct 01 13:30:57 compute-0 sshd-session[28182]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:30:58 compute-0 python3.9[28335]: ansible-ansible.legacy.ping Invoked with data=pong
Oct 01 13:30:59 compute-0 python3.9[28509]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:31:00 compute-0 sudo[28659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpsespiwjqhbwqprzxtwzbcqharhbftt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325460.1247244-69-62709813587648/AnsiballZ_command.py'
Oct 01 13:31:00 compute-0 sudo[28659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:31:00 compute-0 python3.9[28661]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:31:00 compute-0 sudo[28659]: pam_unix(sudo:session): session closed for user root
Oct 01 13:31:01 compute-0 sudo[28812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cidfrbbmrlkqtlvstajdbxbfhixowgjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325461.1683643-93-192186237783280/AnsiballZ_stat.py'
Oct 01 13:31:01 compute-0 sudo[28812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:31:01 compute-0 python3.9[28814]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:31:01 compute-0 sudo[28812]: pam_unix(sudo:session): session closed for user root
Oct 01 13:31:02 compute-0 sudo[28964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uirllnwtymkpklxegdxjzywsdmllcllm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325462.1728334-109-200676181156797/AnsiballZ_file.py'
Oct 01 13:31:02 compute-0 sudo[28964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:31:02 compute-0 python3.9[28966]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:31:02 compute-0 sudo[28964]: pam_unix(sudo:session): session closed for user root
Oct 01 13:31:03 compute-0 sudo[29116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epqmosmeaopymhszbirjxmrdmhqvgfut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325463.2238667-125-223276957789271/AnsiballZ_stat.py'
Oct 01 13:31:03 compute-0 sudo[29116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:31:03 compute-0 python3.9[29118]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:31:03 compute-0 sudo[29116]: pam_unix(sudo:session): session closed for user root
Oct 01 13:31:04 compute-0 sudo[29239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owevauzgahrgokznjmceemzmgyedtinf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325463.2238667-125-223276957789271/AnsiballZ_copy.py'
Oct 01 13:31:04 compute-0 sudo[29239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:31:04 compute-0 python3.9[29241]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759325463.2238667-125-223276957789271/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:31:04 compute-0 sudo[29239]: pam_unix(sudo:session): session closed for user root
Oct 01 13:31:05 compute-0 sudo[29391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmtanpifjyjxtrvwzbefogejscxybqtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325464.6998024-155-263304633777141/AnsiballZ_setup.py'
Oct 01 13:31:05 compute-0 sudo[29391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:31:05 compute-0 python3.9[29393]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:31:05 compute-0 sudo[29391]: pam_unix(sudo:session): session closed for user root
Oct 01 13:31:06 compute-0 sudo[29547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzbfhxxvdbwoocljkzohdssgdpautvod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325465.770402-171-209276178259916/AnsiballZ_file.py'
Oct 01 13:31:06 compute-0 sudo[29547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:31:06 compute-0 python3.9[29549]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:31:06 compute-0 sudo[29547]: pam_unix(sudo:session): session closed for user root
Oct 01 13:31:07 compute-0 python3.9[29699]: ansible-ansible.builtin.service_facts Invoked
Oct 01 13:31:12 compute-0 python3.9[29954]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:31:13 compute-0 python3.9[30104]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:31:14 compute-0 python3.9[30258]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:31:15 compute-0 sudo[30414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zejdbjuvzlryabkfqckiwjakwasngjad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325474.957567-267-194393668886641/AnsiballZ_setup.py'
Oct 01 13:31:15 compute-0 sudo[30414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:31:15 compute-0 python3.9[30416]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:31:15 compute-0 sudo[30414]: pam_unix(sudo:session): session closed for user root
Oct 01 13:31:16 compute-0 sudo[30498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvxgythlajfvhsdilszutktahipsgpdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325474.957567-267-194393668886641/AnsiballZ_dnf.py'
Oct 01 13:31:16 compute-0 sudo[30498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:31:16 compute-0 python3.9[30500]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:31:59 compute-0 systemd[1]: Reloading.
Oct 01 13:31:59 compute-0 systemd-rc-local-generator[30690]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:31:59 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct 01 13:31:59 compute-0 systemd[1]: Reloading.
Oct 01 13:31:59 compute-0 systemd-rc-local-generator[30737]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:31:59 compute-0 systemd[1]: Starting dnf makecache...
Oct 01 13:31:59 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct 01 13:31:59 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct 01 13:31:59 compute-0 systemd[1]: Reloading.
Oct 01 13:32:00 compute-0 dnf[30747]: Repository 'gating-repo' is missing name in configuration, using id.
Oct 01 13:32:00 compute-0 dnf[30747]: Failed determining last makecache time.
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-openstack-barbican-42b4c41831408a8e323 148 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 systemd-rc-local-generator[30780]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 203 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-openstack-cinder-1c00d6490d88e436f26ef 188 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-python-stevedore-c4acc5639fd2329372142 189 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-python-cloudkitty-tests-tempest-3961dc 214 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-os-net-config-28598c2978b9e2207dd19fc4 165 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 190 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-python-designate-tests-tempest-347fdbc 177 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-openstack-glance-1fd12c29b339f30fe823e 195 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 194 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-openstack-manila-3c01b7181572c95dac462 195 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-python-whitebox-neutron-tests-tempest- 202 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-openstack-octavia-ba397f07a7331190208c 199 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-openstack-watcher-c014f81a8647287f6dcc 198 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-edpm-image-builder-55ba53cf215b14ed95b 199 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-puppet-ceph-b0c245ccde541a63fde0564366 200 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-openstack-swift-dc98a8463506ac520c469a 183 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-python-tempestconf-8515371b7cceebd4282 170 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: delorean-openstack-heat-ui-013accbfd179753bc3f0 182 kB/s | 3.0 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: gating-repo                                     246 kB/s | 1.5 kB     00:00
Oct 01 13:32:00 compute-0 dbus-broker-launch[748]: Noticed file-system modification, trigger reload.
Oct 01 13:32:00 compute-0 dbus-broker-launch[748]: Noticed file-system modification, trigger reload.
Oct 01 13:32:00 compute-0 dbus-broker-launch[748]: Noticed file-system modification, trigger reload.
Oct 01 13:32:00 compute-0 dnf[30747]: CentOS Stream 9 - BaseOS                         25 kB/s | 6.7 kB     00:00
Oct 01 13:32:00 compute-0 dnf[30747]: CentOS Stream 9 - AppStream                      64 kB/s | 6.8 kB     00:00
Oct 01 13:32:01 compute-0 dnf[30747]: CentOS Stream 9 - CRB                            66 kB/s | 6.6 kB     00:00
Oct 01 13:32:01 compute-0 dnf[30747]: CentOS Stream 9 - Extras packages                32 kB/s | 8.0 kB     00:00
Oct 01 13:32:01 compute-0 dnf[30747]: dlrn-antelope-testing                           194 kB/s | 3.0 kB     00:00
Oct 01 13:32:01 compute-0 dnf[30747]: dlrn-antelope-build-deps                        185 kB/s | 3.0 kB     00:00
Oct 01 13:32:01 compute-0 dnf[30747]: centos9-rabbitmq                                113 kB/s | 3.0 kB     00:00
Oct 01 13:32:01 compute-0 dnf[30747]: centos9-storage                                 121 kB/s | 3.0 kB     00:00
Oct 01 13:32:01 compute-0 dnf[30747]: centos9-opstools                                123 kB/s | 3.0 kB     00:00
Oct 01 13:32:01 compute-0 dnf[30747]: NFV SIG OpenvSwitch                             119 kB/s | 3.0 kB     00:00
Oct 01 13:32:01 compute-0 dnf[30747]: repo-setup-centos-appstream                     181 kB/s | 4.4 kB     00:00
Oct 01 13:32:01 compute-0 dnf[30747]: repo-setup-centos-baseos                        194 kB/s | 3.9 kB     00:00
Oct 01 13:32:01 compute-0 dnf[30747]: repo-setup-centos-highavailability              184 kB/s | 3.9 kB     00:00
Oct 01 13:32:01 compute-0 dnf[30747]: repo-setup-centos-powertools                    193 kB/s | 4.3 kB     00:00
Oct 01 13:32:01 compute-0 dnf[30747]: Extra Packages for Enterprise Linux 9 - x86_64  210 kB/s |  33 kB     00:00
Oct 01 13:32:02 compute-0 dnf[30747]: Metadata cache created.
Oct 01 13:32:02 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct 01 13:32:02 compute-0 systemd[1]: Finished dnf makecache.
Oct 01 13:32:02 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.645s CPU time.
Oct 01 13:32:37 compute-0 sshd-session[30967]: Received disconnect from 193.46.255.159 port 12080:11:  [preauth]
Oct 01 13:32:37 compute-0 sshd-session[30967]: Disconnected from authenticating user root 193.46.255.159 port 12080 [preauth]
Oct 01 13:33:01 compute-0 anacron[1070]: Job `cron.daily' started
Oct 01 13:33:01 compute-0 anacron[1070]: Job `cron.daily' terminated
Oct 01 13:33:08 compute-0 kernel: SELinux:  Converting 2715 SID table entries...
Oct 01 13:33:08 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 01 13:33:08 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 01 13:33:08 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 01 13:33:08 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 01 13:33:08 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 01 13:33:08 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 01 13:33:08 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 01 13:33:09 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct 01 13:33:09 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 01 13:33:09 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 01 13:33:09 compute-0 systemd[1]: Reloading.
Oct 01 13:33:09 compute-0 systemd-rc-local-generator[31147]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:33:09 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 01 13:33:09 compute-0 systemd[1]: Starting PackageKit Daemon...
Oct 01 13:33:09 compute-0 PackageKit[31418]: daemon start
Oct 01 13:33:09 compute-0 systemd[1]: Started PackageKit Daemon.
Oct 01 13:33:10 compute-0 sudo[30498]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:10 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 01 13:33:10 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 01 13:33:10 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.319s CPU time.
Oct 01 13:33:10 compute-0 systemd[1]: run-r403a4138f3fc46b3a56c8718ce12c31d.service: Deactivated successfully.
Oct 01 13:33:10 compute-0 sudo[32062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmvqfyfheashtschkkugpdmewechycic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325590.3196955-291-42752126494516/AnsiballZ_command.py'
Oct 01 13:33:10 compute-0 sudo[32062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:10 compute-0 python3.9[32064]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:33:12 compute-0 sudo[32062]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:13 compute-0 sudo[32343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjikwvzaotdjxtlpzxmcpdwpfnymwgdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325592.5848556-307-110046221383041/AnsiballZ_selinux.py'
Oct 01 13:33:13 compute-0 sudo[32343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:13 compute-0 python3.9[32345]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct 01 13:33:13 compute-0 sudo[32343]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:14 compute-0 sudo[32495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drpehsbfuubuwqhudkctokqhvpyjmbfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325593.9739747-329-188901779286947/AnsiballZ_command.py'
Oct 01 13:33:14 compute-0 sudo[32495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:14 compute-0 python3.9[32497]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct 01 13:33:15 compute-0 sudo[32495]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:16 compute-0 sudo[32648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apoxkvkevtnykbmucxrnelzpgtmhudst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325595.8743777-345-16086555648915/AnsiballZ_file.py'
Oct 01 13:33:16 compute-0 sudo[32648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:16 compute-0 python3.9[32650]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:33:16 compute-0 sudo[32648]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:18 compute-0 sudo[32800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rimxctihslvyvgfzfqchrannnlbktubq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325597.8771493-361-96497746165586/AnsiballZ_mount.py'
Oct 01 13:33:18 compute-0 sudo[32800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:18 compute-0 python3.9[32802]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct 01 13:33:18 compute-0 sudo[32800]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:22 compute-0 sudo[32952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmwbcqgiywftgytyklpdufybarfczwwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325602.6621711-417-44083733305779/AnsiballZ_file.py'
Oct 01 13:33:22 compute-0 sudo[32952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:24 compute-0 python3.9[32954]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:33:24 compute-0 sudo[32952]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:24 compute-0 sudo[33104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewwrczqoxbyevhqpmypbtwhqnmebisua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325604.4563766-433-172879118782204/AnsiballZ_stat.py'
Oct 01 13:33:24 compute-0 sudo[33104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:24 compute-0 python3.9[33106]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:33:24 compute-0 sudo[33104]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:25 compute-0 sudo[33227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wulunpwzreetjsoipmmquzkurakskpvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325604.4563766-433-172879118782204/AnsiballZ_copy.py'
Oct 01 13:33:25 compute-0 sudo[33227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:25 compute-0 python3.9[33229]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759325604.4563766-433-172879118782204/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=768c0bf9fa9273d82e48b91de3840276afe8c79e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:33:25 compute-0 sudo[33227]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:27 compute-0 sudo[33379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zorhjgbxtxexeqawocncjanazfjcqfwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325606.6259906-487-48521842510354/AnsiballZ_getent.py'
Oct 01 13:33:27 compute-0 sudo[33379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:27 compute-0 python3.9[33381]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct 01 13:33:27 compute-0 sudo[33379]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:28 compute-0 sudo[33532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fydmpokcwtnluttdcouigopznmabywvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325607.7286007-503-50966198137770/AnsiballZ_group.py'
Oct 01 13:33:28 compute-0 sudo[33532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:28 compute-0 python3.9[33534]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 01 13:33:28 compute-0 groupadd[33535]: group added to /etc/group: name=qemu, GID=107
Oct 01 13:33:28 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 01 13:33:28 compute-0 groupadd[33535]: group added to /etc/gshadow: name=qemu
Oct 01 13:33:28 compute-0 groupadd[33535]: new group: name=qemu, GID=107
Oct 01 13:33:28 compute-0 sudo[33532]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:29 compute-0 sudo[33691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuroisbzdiqkhzhmwyvuwwtlxpxaiccr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325609.0275843-519-144018326172246/AnsiballZ_user.py'
Oct 01 13:33:29 compute-0 sudo[33691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:29 compute-0 python3.9[33693]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 01 13:33:29 compute-0 useradd[33695]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Oct 01 13:33:29 compute-0 sudo[33691]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:31 compute-0 sudo[33851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtavtpsuaxwlzbuijyrjjycgctnkdyoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325611.1842558-535-218211388976898/AnsiballZ_getent.py'
Oct 01 13:33:31 compute-0 sudo[33851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:31 compute-0 python3.9[33853]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct 01 13:33:31 compute-0 sudo[33851]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:32 compute-0 sudo[34004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnylhavhobggfuetyowvxnicdntfgpvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325612.0448565-551-219212334621701/AnsiballZ_group.py'
Oct 01 13:33:32 compute-0 sudo[34004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:32 compute-0 python3.9[34006]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 01 13:33:32 compute-0 groupadd[34007]: group added to /etc/group: name=hugetlbfs, GID=42477
Oct 01 13:33:32 compute-0 groupadd[34007]: group added to /etc/gshadow: name=hugetlbfs
Oct 01 13:33:32 compute-0 groupadd[34007]: new group: name=hugetlbfs, GID=42477
Oct 01 13:33:32 compute-0 sudo[34004]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:33 compute-0 sudo[34162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byflayimujdyiltlatwostpvusukfwlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325612.8659492-569-97172440032456/AnsiballZ_file.py'
Oct 01 13:33:33 compute-0 sudo[34162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:33 compute-0 python3.9[34164]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct 01 13:33:33 compute-0 sudo[34162]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:34 compute-0 sudo[34314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkjhfrmejdftaafvnfvknnyxwbxdphgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325613.9077888-591-52848179443247/AnsiballZ_dnf.py'
Oct 01 13:33:34 compute-0 sudo[34314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:34 compute-0 python3.9[34316]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:33:36 compute-0 sudo[34314]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:37 compute-0 sudo[34467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-empkmjlgjipyclsmbodarvfhjvlhnqix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325616.8821042-607-41501965408691/AnsiballZ_file.py'
Oct 01 13:33:37 compute-0 sudo[34467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:37 compute-0 python3.9[34469]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:33:37 compute-0 sudo[34467]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:37 compute-0 sudo[34619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vplttwcixgagdwcouxwhkmliduysally ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325617.667974-623-57748489271825/AnsiballZ_stat.py'
Oct 01 13:33:37 compute-0 sudo[34619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:38 compute-0 python3.9[34621]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:33:38 compute-0 sudo[34619]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:38 compute-0 sudo[34742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwskctgzizrughzarqruwebaozncidqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325617.667974-623-57748489271825/AnsiballZ_copy.py'
Oct 01 13:33:38 compute-0 sudo[34742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:38 compute-0 python3.9[34744]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759325617.667974-623-57748489271825/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:33:38 compute-0 sudo[34742]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:39 compute-0 sudo[34894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itvdoeimjclkricrjnatocscihxkvsrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325619.0241268-653-145154486588420/AnsiballZ_systemd.py'
Oct 01 13:33:39 compute-0 sudo[34894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:40 compute-0 python3.9[34896]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:33:40 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 01 13:33:40 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct 01 13:33:40 compute-0 kernel: Bridge firewalling registered
Oct 01 13:33:40 compute-0 systemd-modules-load[34900]: Inserted module 'br_netfilter'
Oct 01 13:33:40 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 01 13:33:40 compute-0 sudo[34894]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:40 compute-0 sudo[35053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixqmicszkzvaemisyjujoroeqmsqvcsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325620.416181-669-134645803583082/AnsiballZ_stat.py'
Oct 01 13:33:40 compute-0 sudo[35053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:40 compute-0 python3.9[35055]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:33:40 compute-0 sudo[35053]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:41 compute-0 sudo[35176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwcnrbujuumtztbsfyhbcfssjsbullzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325620.416181-669-134645803583082/AnsiballZ_copy.py'
Oct 01 13:33:41 compute-0 sudo[35176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:41 compute-0 python3.9[35178]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759325620.416181-669-134645803583082/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:33:41 compute-0 sudo[35176]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:42 compute-0 sudo[35328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldgmsguhrpkhbbibfqowwztqidpsyelc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325622.0040255-705-171590009448799/AnsiballZ_dnf.py'
Oct 01 13:33:42 compute-0 sudo[35328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:42 compute-0 python3.9[35330]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:33:46 compute-0 dbus-broker-launch[748]: Noticed file-system modification, trigger reload.
Oct 01 13:33:46 compute-0 dbus-broker-launch[748]: Noticed file-system modification, trigger reload.
Oct 01 13:33:46 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 01 13:33:46 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 01 13:33:46 compute-0 systemd[1]: Reloading.
Oct 01 13:33:46 compute-0 systemd-rc-local-generator[35392]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:33:46 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 01 13:33:47 compute-0 sudo[35328]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:48 compute-0 python3.9[36895]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:33:49 compute-0 python3.9[38006]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct 01 13:33:49 compute-0 python3.9[38717]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:33:50 compute-0 sudo[39499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkpyewvwgdqjxnqwawcokquivookwfva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325630.2793987-783-275279232980340/AnsiballZ_command.py'
Oct 01 13:33:50 compute-0 sudo[39499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:50 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 01 13:33:50 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 01 13:33:50 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.980s CPU time.
Oct 01 13:33:50 compute-0 systemd[1]: run-rd11608e2ef964c0da0ac7e7aec396fa2.service: Deactivated successfully.
Oct 01 13:33:50 compute-0 python3.9[39501]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:33:50 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 01 13:33:51 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 01 13:33:51 compute-0 sudo[39499]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:52 compute-0 sudo[39873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snsjqvlwbmjnlflmgnljluwahjqnaqlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325632.0305614-801-43696128676314/AnsiballZ_systemd.py'
Oct 01 13:33:52 compute-0 sudo[39873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:52 compute-0 python3.9[39875]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:33:52 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct 01 13:33:52 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Oct 01 13:33:52 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct 01 13:33:52 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct 01 13:33:53 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Oct 01 13:33:53 compute-0 sudo[39873]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:53 compute-0 python3.9[40036]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct 01 13:33:56 compute-0 sudo[40186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwqbdmnmqkufgbmvqszdhyyhatqyitxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325635.9209163-915-155577069172325/AnsiballZ_systemd.py'
Oct 01 13:33:56 compute-0 sudo[40186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:56 compute-0 python3.9[40188]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:33:56 compute-0 systemd[1]: Reloading.
Oct 01 13:33:56 compute-0 systemd-rc-local-generator[40218]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:33:56 compute-0 sudo[40186]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:57 compute-0 sudo[40375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skdicxhxjrrutvoiavavlpvfjxeaeqxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325637.0128531-915-28580417946864/AnsiballZ_systemd.py'
Oct 01 13:33:57 compute-0 sudo[40375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:57 compute-0 python3.9[40377]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:33:57 compute-0 systemd[1]: Reloading.
Oct 01 13:33:57 compute-0 systemd-rc-local-generator[40402]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:33:57 compute-0 sudo[40375]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:58 compute-0 sudo[40564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnkchnannsbxjhfzmfvwlqcrppyxtusj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325638.2795815-947-79291266290769/AnsiballZ_command.py'
Oct 01 13:33:58 compute-0 sudo[40564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:58 compute-0 python3.9[40566]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:33:59 compute-0 sudo[40564]: pam_unix(sudo:session): session closed for user root
Oct 01 13:33:59 compute-0 sudo[40717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etnksisyzyplnhpwvbcqbekoacsltkrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325639.3561902-963-35438882646192/AnsiballZ_command.py'
Oct 01 13:33:59 compute-0 sudo[40717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:33:59 compute-0 python3.9[40719]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:33:59 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct 01 13:33:59 compute-0 sudo[40717]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:00 compute-0 sudo[40870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsqgfmwhfffcxeleakerjbrqdhfpmwbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325640.1307206-979-18515066072345/AnsiballZ_command.py'
Oct 01 13:34:00 compute-0 sudo[40870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:00 compute-0 python3.9[40872]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:34:02 compute-0 sudo[40870]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:02 compute-0 sudo[41032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeqqdpdnjrgciwpzoducyijeovaugvwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325642.3812075-995-171228713129632/AnsiballZ_command.py'
Oct 01 13:34:02 compute-0 sudo[41032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:02 compute-0 python3.9[41034]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:34:02 compute-0 sudo[41032]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:03 compute-0 sudo[41185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaxxjvjiswjfpfovefwftqeqschugmjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325643.1260035-1011-9807510573269/AnsiballZ_systemd.py'
Oct 01 13:34:03 compute-0 sudo[41185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:03 compute-0 python3.9[41187]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:34:03 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct 01 13:34:03 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Oct 01 13:34:03 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Oct 01 13:34:03 compute-0 systemd[1]: Starting Apply Kernel Variables...
Oct 01 13:34:03 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct 01 13:34:03 compute-0 systemd[1]: Finished Apply Kernel Variables.
Oct 01 13:34:03 compute-0 sudo[41185]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:04 compute-0 sshd-session[28185]: Connection closed by 192.168.122.30 port 34100
Oct 01 13:34:04 compute-0 sshd-session[28182]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:34:04 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Oct 01 13:34:04 compute-0 systemd[1]: session-10.scope: Consumed 2min 13.281s CPU time.
Oct 01 13:34:04 compute-0 systemd-logind[791]: Session 10 logged out. Waiting for processes to exit.
Oct 01 13:34:04 compute-0 systemd-logind[791]: Removed session 10.
Oct 01 13:34:09 compute-0 sshd-session[41217]: Accepted publickey for zuul from 192.168.122.30 port 54736 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:34:09 compute-0 systemd-logind[791]: New session 11 of user zuul.
Oct 01 13:34:09 compute-0 systemd[1]: Started Session 11 of User zuul.
Oct 01 13:34:09 compute-0 sshd-session[41217]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:34:10 compute-0 python3.9[41370]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:34:12 compute-0 python3.9[41524]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:34:13 compute-0 sudo[41678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xttcqccrlyeidagrhgizmsrzkzizqere ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325652.7677298-80-49747551450576/AnsiballZ_command.py'
Oct 01 13:34:13 compute-0 sudo[41678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:13 compute-0 python3.9[41680]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:34:13 compute-0 sudo[41678]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:14 compute-0 python3.9[41831]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:34:15 compute-0 sudo[41985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdkkcjlfgvvvhjraqcghxqqndnfroawz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325654.8576334-120-4298580217551/AnsiballZ_setup.py'
Oct 01 13:34:15 compute-0 sudo[41985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:15 compute-0 python3.9[41987]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:34:15 compute-0 sudo[41985]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:16 compute-0 sudo[42069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qygesilveoftelfllgxzvnstevboedzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325654.8576334-120-4298580217551/AnsiballZ_dnf.py'
Oct 01 13:34:16 compute-0 sudo[42069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:16 compute-0 python3.9[42071]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:34:17 compute-0 sudo[42069]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:18 compute-0 sudo[42222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nljhvifiuqpdnffjcohevbxddzfyrpqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325657.6451557-144-179123715692714/AnsiballZ_setup.py'
Oct 01 13:34:18 compute-0 sudo[42222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:18 compute-0 python3.9[42224]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:34:18 compute-0 sudo[42222]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:19 compute-0 sudo[42393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxtcjjrcbhcmtovhzlilulisvojusxqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325658.7312891-166-239719758165236/AnsiballZ_file.py'
Oct 01 13:34:19 compute-0 sudo[42393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:19 compute-0 python3.9[42395]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:34:19 compute-0 sudo[42393]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:19 compute-0 sudo[42545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igjcueluilpfvsgvwktvcogedndwayqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325659.6240559-182-197378492387120/AnsiballZ_command.py'
Oct 01 13:34:19 compute-0 sudo[42545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:20 compute-0 python3.9[42547]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:34:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat374862180-merged.mount: Deactivated successfully.
Oct 01 13:34:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck2114668845-merged.mount: Deactivated successfully.
Oct 01 13:34:20 compute-0 podman[42548]: 2025-10-01 13:34:20.180104812 +0000 UTC m=+0.072406396 system refresh
Oct 01 13:34:20 compute-0 sudo[42545]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:20 compute-0 sudo[42708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ticjrktooaxgqxfbtwlxfifqinilwgtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325660.3950582-198-31885840519439/AnsiballZ_stat.py'
Oct 01 13:34:20 compute-0 sudo[42708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:34:21 compute-0 python3.9[42710]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:34:21 compute-0 sudo[42708]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:21 compute-0 sudo[42831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgakemvpkhhmjhebbbhisiksghhbqqzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325660.3950582-198-31885840519439/AnsiballZ_copy.py'
Oct 01 13:34:21 compute-0 sudo[42831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:21 compute-0 python3.9[42833]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759325660.3950582-198-31885840519439/.source.json follow=False _original_basename=podman_network_config.j2 checksum=a73ba6b1c69cb97f119120682afd4b31447dbf26 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:34:21 compute-0 sudo[42831]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:22 compute-0 sudo[42983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwcfzhsctvjzrfhvbcbwdfqnbmghkiqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325662.1360111-228-21040740233222/AnsiballZ_stat.py'
Oct 01 13:34:22 compute-0 sudo[42983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:22 compute-0 python3.9[42985]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:34:22 compute-0 sudo[42983]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:23 compute-0 sudo[43106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbqqhibtsrjwysttexdsswggzbahmegk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325662.1360111-228-21040740233222/AnsiballZ_copy.py'
Oct 01 13:34:23 compute-0 sudo[43106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:23 compute-0 python3.9[43108]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759325662.1360111-228-21040740233222/.source.conf follow=False _original_basename=registries.conf.j2 checksum=b723c254c5347521a0bd9978182359a7d08823fc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:34:23 compute-0 sudo[43106]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:24 compute-0 sudo[43258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-audfaysduoujixpwixgzthndwveqzhyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325663.5963342-260-256790291338538/AnsiballZ_ini_file.py'
Oct 01 13:34:24 compute-0 sudo[43258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:24 compute-0 python3.9[43260]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:34:24 compute-0 sudo[43258]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:24 compute-0 sudo[43410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmmpxuzzmwklbhindsgvrxgoheglydnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325664.4452536-260-173630731332492/AnsiballZ_ini_file.py'
Oct 01 13:34:24 compute-0 sudo[43410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:25 compute-0 python3.9[43412]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:34:25 compute-0 sudo[43410]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:25 compute-0 sudo[43562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaacyxtknhpzcucvhrvqnonhlkwtyvcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325665.1860054-260-247686332893430/AnsiballZ_ini_file.py'
Oct 01 13:34:25 compute-0 sudo[43562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:25 compute-0 python3.9[43564]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:34:25 compute-0 sudo[43562]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:26 compute-0 sudo[43714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nixoxirfdganwvzxfdcuysydqqirhzbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325665.8596756-260-165063763233489/AnsiballZ_ini_file.py'
Oct 01 13:34:26 compute-0 sudo[43714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:26 compute-0 python3.9[43716]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:34:26 compute-0 sudo[43714]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:27 compute-0 python3.9[43866]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:34:28 compute-0 sudo[44018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyiatesofpohjdiywmnbzfsuyspzwheh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325667.7044764-340-252370360869629/AnsiballZ_dnf.py'
Oct 01 13:34:28 compute-0 sudo[44018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:28 compute-0 python3.9[44020]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 01 13:34:29 compute-0 sudo[44018]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:30 compute-0 sudo[44171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msvrhoqahtyzxldvullzbwsxpivyqkpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325669.688381-356-40803366218491/AnsiballZ_dnf.py'
Oct 01 13:34:30 compute-0 sudo[44171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:30 compute-0 python3.9[44173]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 01 13:34:32 compute-0 sudo[44171]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:32 compute-0 sudo[44331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzkvxervvybehfobsaqwwpnjjqyzdlvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325672.5531173-376-261198556568782/AnsiballZ_dnf.py'
Oct 01 13:34:32 compute-0 sudo[44331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:33 compute-0 python3.9[44333]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 01 13:34:34 compute-0 sudo[44331]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:34 compute-0 sudo[44484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsjcetfaqvuahqlhxuxjutcvvrgdzohs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325674.4889026-394-221010481373134/AnsiballZ_dnf.py'
Oct 01 13:34:34 compute-0 sudo[44484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:35 compute-0 python3.9[44486]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 01 13:34:36 compute-0 sudo[44484]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:36 compute-0 sudo[44637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyzgebbmeocqfpzzuonodrckqpsfyemv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325676.5656843-416-229893552489134/AnsiballZ_dnf.py'
Oct 01 13:34:36 compute-0 sudo[44637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:37 compute-0 python3.9[44639]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 01 13:34:38 compute-0 sudo[44637]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:39 compute-0 sudo[44793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzhwivzwmjfdknenazpzezmlcugtrfsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325679.071329-432-8492405630645/AnsiballZ_dnf.py'
Oct 01 13:34:39 compute-0 sudo[44793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:39 compute-0 python3.9[44795]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 01 13:34:46 compute-0 sudo[44793]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:47 compute-0 sudo[44961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkczfkljftpeciypxdgkxnnmpmgfxrst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325687.1409113-450-75272318332669/AnsiballZ_dnf.py'
Oct 01 13:34:47 compute-0 sudo[44961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:47 compute-0 python3.9[44963]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 01 13:34:48 compute-0 sudo[44961]: pam_unix(sudo:session): session closed for user root
Oct 01 13:34:49 compute-0 sudo[45114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvgliovjagmcnggfgsposdccsybzajra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325689.2831943-468-166581965807387/AnsiballZ_dnf.py'
Oct 01 13:34:49 compute-0 sudo[45114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:34:49 compute-0 python3.9[45116]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 01 13:35:05 compute-0 sudo[45114]: pam_unix(sudo:session): session closed for user root
Oct 01 13:35:06 compute-0 sudo[45450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpdohcfdwzszaxabgfrktwsfguhfxfwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325706.2966418-490-4829725082072/AnsiballZ_file.py'
Oct 01 13:35:06 compute-0 sudo[45450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:35:06 compute-0 python3.9[45452]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:35:06 compute-0 sudo[45450]: pam_unix(sudo:session): session closed for user root
Oct 01 13:35:07 compute-0 sudo[45625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvszbpikzlkmkbysfgmnfyheiurxffzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325707.156388-506-261641476203488/AnsiballZ_stat.py'
Oct 01 13:35:07 compute-0 sudo[45625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:35:07 compute-0 python3.9[45627]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:35:07 compute-0 sudo[45625]: pam_unix(sudo:session): session closed for user root
Oct 01 13:35:08 compute-0 sudo[45748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snrshvttpjghokrijngzzkusunuehqdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325707.156388-506-261641476203488/AnsiballZ_copy.py'
Oct 01 13:35:08 compute-0 sudo[45748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:35:08 compute-0 python3.9[45750]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759325707.156388-506-261641476203488/.source.json _original_basename=.843awvkz follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:35:08 compute-0 sudo[45748]: pam_unix(sudo:session): session closed for user root
Oct 01 13:35:09 compute-0 sudo[45900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfsptzfsrebqsyywxbeblnnymjaffyck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325708.9351268-542-52221320860656/AnsiballZ_podman_image.py'
Oct 01 13:35:09 compute-0 sudo[45900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:35:09 compute-0 python3.9[45902]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 01 13:35:09 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat249426138-lower\x2dmapped.mount: Deactivated successfully.
Oct 01 13:35:17 compute-0 podman[45915]: 2025-10-01 13:35:17.655266434 +0000 UTC m=+7.814172732 image pull a742884d734e475a9ceb7e186a2d8775781675f700ff62f05c1b64d66e08b90f 38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Oct 01 13:35:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:17 compute-0 sudo[45900]: pam_unix(sudo:session): session closed for user root
Oct 01 13:35:18 compute-0 sudo[46211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-timphyyswkeokzfwqaowwtiedqxkjgkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325718.4145162-560-35746723926491/AnsiballZ_podman_image.py'
Oct 01 13:35:18 compute-0 sudo[46211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:35:18 compute-0 python3.9[46213]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 01 13:35:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:20 compute-0 podman[46225]: 2025-10-01 13:35:20.657017972 +0000 UTC m=+1.618890552 image pull c8ef9d5640b125c1f3577d8f712edab51eb0591f40b9f49028ec5b54753f0392 38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Oct 01 13:35:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:20 compute-0 sudo[46211]: pam_unix(sudo:session): session closed for user root
Oct 01 13:35:21 compute-0 sudo[46476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdbfcyewjdwnscpvkzlplkdsqgsvoqnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325721.3173752-582-175446019268889/AnsiballZ_podman_image.py'
Oct 01 13:35:21 compute-0 sudo[46476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:35:21 compute-0 python3.9[46478]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 01 13:35:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:31 compute-0 podman[46491]: 2025-10-01 13:35:31.713713778 +0000 UTC m=+9.813925328 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 13:35:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:31 compute-0 sudo[46476]: pam_unix(sudo:session): session closed for user root
Oct 01 13:35:32 compute-0 sudo[46766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnrtobkripexjfwjjcpkpwzpbulehlgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325732.3280299-602-127884346160340/AnsiballZ_podman_image.py'
Oct 01 13:35:32 compute-0 sudo[46766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:35:32 compute-0 python3.9[46768]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 01 13:35:32 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:33 compute-0 podman[46780]: 2025-10-01 13:35:33.405108837 +0000 UTC m=+0.432720177 image pull cb9980503d2e559b80f837e5c1ae5a83d16cee2e99b876ecd89624c1b09d1eaa 38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Oct 01 13:35:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:33 compute-0 sudo[46766]: pam_unix(sudo:session): session closed for user root
Oct 01 13:35:34 compute-0 sudo[47014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjcupsdahkfwtdluujecjkpdsusuyozf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325733.8823822-620-249301319504866/AnsiballZ_podman_image.py'
Oct 01 13:35:34 compute-0 sudo[47014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:35:34 compute-0 python3.9[47016]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 01 13:35:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:36 compute-0 sshd-session[47046]: Invalid user usuario from 80.94.95.116 port 33782
Oct 01 13:35:36 compute-0 sshd-session[47046]: Connection closed by invalid user usuario 80.94.95.116 port 33782 [preauth]
Oct 01 13:35:46 compute-0 podman[47028]: 2025-10-01 13:35:46.095784839 +0000 UTC m=+11.659369322 image pull 656799db0d65542d0e8e413e509a07d242723dfb9640eb11bd2cd711d3cef64f 38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Oct 01 13:35:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:46 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:46 compute-0 sudo[47014]: pam_unix(sudo:session): session closed for user root
Oct 01 13:35:50 compute-0 sudo[47305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gircwodlpmmkjoatpzvsldzilbsilphs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325750.0385933-642-266913268897264/AnsiballZ_podman_image.py'
Oct 01 13:35:50 compute-0 sudo[47305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:35:50 compute-0 python3.9[47307]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.30:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 01 13:35:50 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:53 compute-0 podman[47321]: 2025-10-01 13:35:53.954771053 +0000 UTC m=+3.306447039 image pull d9e9c0111b5be0a1d808f459e84dfaeca66d60ccd3375555faf17ce14bd51b9d 38.102.83.30:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest
Oct 01 13:35:53 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:54 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:54 compute-0 sudo[47305]: pam_unix(sudo:session): session closed for user root
Oct 01 13:35:54 compute-0 sudo[47576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekpukqvknnoifjbwyuclpwsepodefaco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325754.322629-642-167748332710093/AnsiballZ_podman_image.py'
Oct 01 13:35:54 compute-0 sudo[47576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:35:54 compute-0 python3.9[47578]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct 01 13:35:56 compute-0 podman[47589]: 2025-10-01 13:35:56.262231801 +0000 UTC m=+1.408645806 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct 01 13:35:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:35:56 compute-0 sudo[47576]: pam_unix(sudo:session): session closed for user root
Oct 01 13:35:59 compute-0 sshd-session[41220]: Connection closed by 192.168.122.30 port 54736
Oct 01 13:35:59 compute-0 sshd-session[41217]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:35:59 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Oct 01 13:35:59 compute-0 systemd[1]: session-11.scope: Consumed 1min 54.699s CPU time.
Oct 01 13:35:59 compute-0 systemd-logind[791]: Session 11 logged out. Waiting for processes to exit.
Oct 01 13:35:59 compute-0 systemd-logind[791]: Removed session 11.
Oct 01 13:36:04 compute-0 sshd-session[47741]: Accepted publickey for zuul from 192.168.122.30 port 40418 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:36:04 compute-0 systemd-logind[791]: New session 12 of user zuul.
Oct 01 13:36:04 compute-0 systemd[1]: Started Session 12 of User zuul.
Oct 01 13:36:04 compute-0 sshd-session[47741]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:36:05 compute-0 python3.9[47894]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:36:07 compute-0 sudo[48048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvlbqabukqakprigogjfkdqccrzshshy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325766.6176114-52-136870513563520/AnsiballZ_getent.py'
Oct 01 13:36:07 compute-0 sudo[48048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:07 compute-0 python3.9[48050]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct 01 13:36:07 compute-0 sudo[48048]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:08 compute-0 sudo[48201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjlflywfglmgszgxridmdacqhidnbbjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325767.573784-68-197928126526325/AnsiballZ_group.py'
Oct 01 13:36:08 compute-0 sudo[48201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:08 compute-0 python3.9[48203]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 01 13:36:08 compute-0 groupadd[48204]: group added to /etc/group: name=openvswitch, GID=42476
Oct 01 13:36:08 compute-0 groupadd[48204]: group added to /etc/gshadow: name=openvswitch
Oct 01 13:36:08 compute-0 groupadd[48204]: new group: name=openvswitch, GID=42476
Oct 01 13:36:08 compute-0 sudo[48201]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:09 compute-0 sudo[48359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhygkygkcqyjtoedlavwyarqaiktkqgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325768.6028059-84-201143296314260/AnsiballZ_user.py'
Oct 01 13:36:09 compute-0 sudo[48359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:09 compute-0 python3.9[48361]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 01 13:36:09 compute-0 useradd[48363]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Oct 01 13:36:09 compute-0 useradd[48363]: add 'openvswitch' to group 'hugetlbfs'
Oct 01 13:36:09 compute-0 useradd[48363]: add 'openvswitch' to shadow group 'hugetlbfs'
Oct 01 13:36:09 compute-0 sudo[48359]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:10 compute-0 sudo[48519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqdvcjlqenqcynrroqyhutjawjwsonmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325769.953435-104-115252610610463/AnsiballZ_setup.py'
Oct 01 13:36:10 compute-0 sudo[48519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:10 compute-0 python3.9[48521]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:36:10 compute-0 sudo[48519]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:11 compute-0 sudo[48603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqadzugicqrspzwlolxfpiqnuzcupadl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325769.953435-104-115252610610463/AnsiballZ_dnf.py'
Oct 01 13:36:11 compute-0 sudo[48603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:11 compute-0 python3.9[48605]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 01 13:36:13 compute-0 sudo[48603]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:13 compute-0 sudo[48764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgywqotagdhcnjyejzphhzqgrtozsmet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325773.465894-132-2545568729637/AnsiballZ_dnf.py'
Oct 01 13:36:13 compute-0 sudo[48764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:14 compute-0 python3.9[48766]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:36:25 compute-0 kernel: SELinux:  Converting 2727 SID table entries...
Oct 01 13:36:25 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 01 13:36:25 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 01 13:36:25 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 01 13:36:25 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 01 13:36:25 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 01 13:36:25 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 01 13:36:25 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 01 13:36:25 compute-0 groupadd[48790]: group added to /etc/group: name=unbound, GID=993
Oct 01 13:36:25 compute-0 groupadd[48790]: group added to /etc/gshadow: name=unbound
Oct 01 13:36:25 compute-0 groupadd[48790]: new group: name=unbound, GID=993
Oct 01 13:36:25 compute-0 useradd[48797]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Oct 01 13:36:25 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct 01 13:36:25 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct 01 13:36:27 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 01 13:36:27 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 01 13:36:27 compute-0 systemd[1]: Reloading.
Oct 01 13:36:27 compute-0 systemd-sysv-generator[49294]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:36:27 compute-0 systemd-rc-local-generator[49290]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:36:27 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 01 13:36:27 compute-0 sudo[48764]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:28 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 01 13:36:28 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 01 13:36:28 compute-0 systemd[1]: run-r343121e54d9342579c3d2dea1faef53e.service: Deactivated successfully.
Oct 01 13:36:29 compute-0 sudo[49866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llzuckfzxbdsyiasxhaknjeloohunqxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325789.177878-148-24172718189498/AnsiballZ_systemd.py'
Oct 01 13:36:29 compute-0 sudo[49866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:30 compute-0 python3.9[49868]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 01 13:36:30 compute-0 systemd[1]: Reloading.
Oct 01 13:36:30 compute-0 systemd-rc-local-generator[49895]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:36:30 compute-0 systemd-sysv-generator[49901]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:36:30 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Oct 01 13:36:30 compute-0 chown[49909]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct 01 13:36:30 compute-0 ovs-ctl[49914]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct 01 13:36:30 compute-0 ovs-ctl[49914]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct 01 13:36:30 compute-0 ovs-ctl[49914]: Starting ovsdb-server [  OK  ]
Oct 01 13:36:30 compute-0 ovs-vsctl[49963]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct 01 13:36:30 compute-0 ovs-vsctl[49983]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"10cf9814-09fa-4bad-879a-270f9b64eda3\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct 01 13:36:30 compute-0 ovs-ctl[49914]: Configuring Open vSwitch system IDs [  OK  ]
Oct 01 13:36:30 compute-0 ovs-vsctl[49989]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct 01 13:36:30 compute-0 ovs-ctl[49914]: Enabling remote OVSDB managers [  OK  ]
Oct 01 13:36:30 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Oct 01 13:36:30 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct 01 13:36:31 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct 01 13:36:31 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct 01 13:36:31 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Oct 01 13:36:31 compute-0 ovs-ctl[50034]: Inserting openvswitch module [  OK  ]
Oct 01 13:36:31 compute-0 ovs-ctl[50003]: Starting ovs-vswitchd [  OK  ]
Oct 01 13:36:31 compute-0 ovs-vsctl[50051]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct 01 13:36:31 compute-0 ovs-ctl[50003]: Enabling remote OVSDB managers [  OK  ]
Oct 01 13:36:31 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct 01 13:36:31 compute-0 systemd[1]: Starting Open vSwitch...
Oct 01 13:36:31 compute-0 systemd[1]: Finished Open vSwitch.
Oct 01 13:36:31 compute-0 sudo[49866]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:32 compute-0 python3.9[50203]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:36:33 compute-0 sudo[50353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfihhacdtbxwuhtozomeidptxqdrawhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325792.5817776-184-176698178528618/AnsiballZ_sefcontext.py'
Oct 01 13:36:33 compute-0 sudo[50353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:33 compute-0 python3.9[50355]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct 01 13:36:34 compute-0 kernel: SELinux:  Converting 2741 SID table entries...
Oct 01 13:36:34 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 01 13:36:34 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 01 13:36:34 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 01 13:36:34 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 01 13:36:34 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 01 13:36:34 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 01 13:36:34 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 01 13:36:35 compute-0 sudo[50353]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:36 compute-0 python3.9[50510]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:36:37 compute-0 sudo[50666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfjymzuzdbsorpmjlcqjiczkprkwbmlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325796.7133517-220-93510741095567/AnsiballZ_dnf.py'
Oct 01 13:36:37 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct 01 13:36:37 compute-0 sudo[50666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:37 compute-0 python3.9[50668]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:36:38 compute-0 sudo[50666]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:39 compute-0 sudo[50819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adwkxypalvteewkiaodzcidkglgszxre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325798.8168693-236-145905426227513/AnsiballZ_command.py'
Oct 01 13:36:39 compute-0 sudo[50819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:39 compute-0 python3.9[50821]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:36:40 compute-0 sudo[50819]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:40 compute-0 sudo[51106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgfrlxgijyvvduhrwhvhlhdzoippqfvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325800.502184-252-156944153783165/AnsiballZ_file.py'
Oct 01 13:36:40 compute-0 sudo[51106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:41 compute-0 python3.9[51108]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 01 13:36:41 compute-0 sudo[51106]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:42 compute-0 python3.9[51258]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:36:42 compute-0 sudo[51410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjwfgopixikjavcnqoglkfbsogrfqjyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325802.3096309-284-10338710931996/AnsiballZ_dnf.py'
Oct 01 13:36:42 compute-0 sudo[51410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:42 compute-0 python3.9[51412]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:36:44 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 01 13:36:44 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 01 13:36:44 compute-0 systemd[1]: Reloading.
Oct 01 13:36:44 compute-0 systemd-rc-local-generator[51452]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:36:44 compute-0 systemd-sysv-generator[51455]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:36:44 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 01 13:36:45 compute-0 sudo[51410]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:45 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 01 13:36:45 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 01 13:36:45 compute-0 systemd[1]: run-r3ec7666fb68c4a48b726382026495c1d.service: Deactivated successfully.
Oct 01 13:36:45 compute-0 sudo[51727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycaowvrziybnpvimaglxyisosxzchsuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325805.5076492-300-229141753770366/AnsiballZ_systemd.py'
Oct 01 13:36:45 compute-0 sudo[51727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:46 compute-0 python3.9[51729]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:36:46 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct 01 13:36:46 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Oct 01 13:36:46 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Oct 01 13:36:46 compute-0 systemd[1]: Stopping Network Manager...
Oct 01 13:36:46 compute-0 NetworkManager[3964]: <info>  [1759325806.2409] caught SIGTERM, shutting down normally.
Oct 01 13:36:46 compute-0 NetworkManager[3964]: <info>  [1759325806.2429] dhcp4 (eth0): canceled DHCP transaction
Oct 01 13:36:46 compute-0 NetworkManager[3964]: <info>  [1759325806.2429] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 01 13:36:46 compute-0 NetworkManager[3964]: <info>  [1759325806.2430] dhcp4 (eth0): state changed no lease
Oct 01 13:36:46 compute-0 NetworkManager[3964]: <info>  [1759325806.2433] manager: NetworkManager state is now CONNECTED_SITE
Oct 01 13:36:46 compute-0 NetworkManager[3964]: <info>  [1759325806.2520] exiting (success)
Oct 01 13:36:46 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 01 13:36:46 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 01 13:36:46 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct 01 13:36:46 compute-0 systemd[1]: Stopped Network Manager.
Oct 01 13:36:46 compute-0 systemd[1]: NetworkManager.service: Consumed 12.212s CPU time, 4.0M memory peak, read 0B from disk, written 43.0K to disk.
Oct 01 13:36:46 compute-0 systemd[1]: Starting Network Manager...
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.3371] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:1fa3a04c-5158-4a26-9fa0-b8b34ae08d38)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.3374] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.3425] manager[0x55943a3e1090]: monitoring kernel firmware directory '/lib/firmware'.
Oct 01 13:36:46 compute-0 systemd[1]: Starting Hostname Service...
Oct 01 13:36:46 compute-0 systemd[1]: Started Hostname Service.
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4522] hostname: hostname: using hostnamed
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4524] hostname: static hostname changed from (none) to "compute-0"
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4528] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4531] manager[0x55943a3e1090]: rfkill: Wi-Fi hardware radio set enabled
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4531] manager[0x55943a3e1090]: rfkill: WWAN hardware radio set enabled
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4549] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4557] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4557] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4558] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4558] manager: Networking is enabled by state file
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4560] settings: Loaded settings plugin: keyfile (internal)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4563] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4585] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4594] dhcp: init: Using DHCP client 'internal'
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4597] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4600] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4606] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4614] device (lo): Activation: starting connection 'lo' (d1361516-740f-4fdb-ad0c-6174cd593c78)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4620] device (eth0): carrier: link connected
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4623] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4627] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4627] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4633] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4638] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4644] device (eth1): carrier: link connected
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4647] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4651] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (601254e3-3abf-5e1b-b4d3-7e1a095eff98) (indicated)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4651] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4655] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4661] device (eth1): Activation: starting connection 'ci-private-network' (601254e3-3abf-5e1b-b4d3-7e1a095eff98)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4666] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct 01 13:36:46 compute-0 systemd[1]: Started Network Manager.
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4672] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4674] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4676] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4685] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4692] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4697] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4701] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4710] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4725] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4730] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4744] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4767] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4784] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4790] dhcp4 (eth0): state changed new lease, address=38.102.83.163
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4796] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4804] device (lo): Activation: successful, device activated.
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.4820] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct 01 13:36:46 compute-0 systemd[1]: Starting Network Manager Wait Online...
Oct 01 13:36:46 compute-0 sudo[51727]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.5660] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.5681] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.5688] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.5691] manager: NetworkManager state is now CONNECTED_LOCAL
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.5695] device (eth1): Activation: successful, device activated.
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.5714] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.5716] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.5721] manager: NetworkManager state is now CONNECTED_SITE
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.5728] device (eth0): Activation: successful, device activated.
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.5738] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct 01 13:36:46 compute-0 NetworkManager[51741]: <info>  [1759325806.5765] manager: startup complete
Oct 01 13:36:46 compute-0 systemd[1]: Finished Network Manager Wait Online.
Oct 01 13:36:47 compute-0 sudo[51953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkbsgacfdkfxkrhxyscborhengihnmnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325806.7187054-316-111914951749538/AnsiballZ_dnf.py'
Oct 01 13:36:47 compute-0 sudo[51953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:47 compute-0 python3.9[51955]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:36:52 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 01 13:36:52 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 01 13:36:52 compute-0 systemd[1]: Reloading.
Oct 01 13:36:52 compute-0 systemd-rc-local-generator[52003]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:36:52 compute-0 systemd-sysv-generator[52011]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:36:53 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 01 13:36:53 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 01 13:36:53 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 01 13:36:53 compute-0 systemd[1]: run-r21e9f589a9c04b8aaa136dd961182fc7.service: Deactivated successfully.
Oct 01 13:36:53 compute-0 sudo[51953]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:54 compute-0 sudo[52415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roxwmmibggqnoudxkyoobedpoihodnhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325814.687189-340-194005175927817/AnsiballZ_stat.py'
Oct 01 13:36:55 compute-0 sudo[52415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:55 compute-0 python3.9[52417]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:36:55 compute-0 sudo[52415]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:55 compute-0 sudo[52567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gthynarwkmrautgpuletqdhbgfeqavuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325815.4859076-358-219333412542115/AnsiballZ_ini_file.py'
Oct 01 13:36:55 compute-0 sudo[52567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:56 compute-0 python3.9[52569]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:36:56 compute-0 sudo[52567]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:56 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 01 13:36:56 compute-0 sudo[52722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwugsshpqyaultswkbslkhrbrcqoywsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325816.4652495-378-175210266207897/AnsiballZ_ini_file.py'
Oct 01 13:36:56 compute-0 sudo[52722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:57 compute-0 python3.9[52724]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:36:57 compute-0 sudo[52722]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:57 compute-0 sudo[52874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtimktotblbczhtiopemwdijqtsdbths ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325817.2608478-378-202103718463722/AnsiballZ_ini_file.py'
Oct 01 13:36:57 compute-0 sudo[52874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:57 compute-0 python3.9[52876]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:36:57 compute-0 sudo[52874]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:58 compute-0 sudo[53026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whjcdowxulgpgojjuwxrrcipfcxwiiri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325817.9766147-408-96270292579898/AnsiballZ_ini_file.py'
Oct 01 13:36:58 compute-0 sudo[53026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:58 compute-0 python3.9[53028]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:36:58 compute-0 sudo[53026]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:58 compute-0 sudo[53178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezcudzkknxkiqrirghckwsxbqwcgrapi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325818.5829113-408-205819931877581/AnsiballZ_ini_file.py'
Oct 01 13:36:58 compute-0 sudo[53178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:59 compute-0 python3.9[53180]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:36:59 compute-0 sudo[53178]: pam_unix(sudo:session): session closed for user root
Oct 01 13:36:59 compute-0 sudo[53330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlytvvogidrwoarovcaapwymjapvjosz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325819.338111-438-45632007423812/AnsiballZ_stat.py'
Oct 01 13:36:59 compute-0 sudo[53330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:36:59 compute-0 python3.9[53332]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:36:59 compute-0 sudo[53330]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:00 compute-0 sudo[53453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qowyjfeepjgdknwfujfdxhgemyxlgcpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325819.338111-438-45632007423812/AnsiballZ_copy.py'
Oct 01 13:37:00 compute-0 sudo[53453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:00 compute-0 python3.9[53455]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759325819.338111-438-45632007423812/.source _original_basename=.9fke6yd6 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:37:00 compute-0 sudo[53453]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:01 compute-0 sudo[53605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbahphuursvveypkwokvqhkfirqycyrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325820.774361-468-254857366038540/AnsiballZ_file.py'
Oct 01 13:37:01 compute-0 sudo[53605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:01 compute-0 python3.9[53607]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:37:01 compute-0 sudo[53605]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:02 compute-0 sudo[53757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-espnrrisohsgbfhtsdhituiamvkeesig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325821.5068567-484-131411461286893/AnsiballZ_edpm_os_net_config_mappings.py'
Oct 01 13:37:02 compute-0 sudo[53757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:02 compute-0 python3.9[53759]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct 01 13:37:02 compute-0 sudo[53757]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:02 compute-0 sudo[53909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfqvoprsihghiqgaqcekbfilxnktezsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325822.5643356-502-11849779089140/AnsiballZ_file.py'
Oct 01 13:37:02 compute-0 sudo[53909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:03 compute-0 python3.9[53911]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:37:03 compute-0 sudo[53909]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:03 compute-0 sudo[54061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plqvsogajaxyqyjkautgqynvhadaasbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325823.4901264-522-252948872920783/AnsiballZ_stat.py'
Oct 01 13:37:03 compute-0 sudo[54061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:04 compute-0 sudo[54061]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:04 compute-0 sudo[54184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjxjyivzldeioasjhwwrbbirjudfmeyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325823.4901264-522-252948872920783/AnsiballZ_copy.py'
Oct 01 13:37:04 compute-0 sudo[54184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:04 compute-0 sudo[54184]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:05 compute-0 sudo[54336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezibwaftskwrwvehqgzabisagrfwglgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325824.8899114-552-69747695126287/AnsiballZ_slurp.py'
Oct 01 13:37:05 compute-0 sudo[54336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:05 compute-0 python3.9[54338]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct 01 13:37:05 compute-0 sudo[54336]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:06 compute-0 sudo[54511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oijcxvfslauehkfzmgmjzrivbivcwfxd ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325826.0599399-570-119409676482976/async_wrapper.py j629828002994 300 /home/zuul/.ansible/tmp/ansible-tmp-1759325826.0599399-570-119409676482976/AnsiballZ_edpm_os_net_config.py _'
Oct 01 13:37:06 compute-0 sudo[54511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:06 compute-0 ansible-async_wrapper.py[54513]: Invoked with j629828002994 300 /home/zuul/.ansible/tmp/ansible-tmp-1759325826.0599399-570-119409676482976/AnsiballZ_edpm_os_net_config.py _
Oct 01 13:37:06 compute-0 ansible-async_wrapper.py[54516]: Starting module and watcher
Oct 01 13:37:06 compute-0 ansible-async_wrapper.py[54516]: Start watching 54517 (300)
Oct 01 13:37:06 compute-0 ansible-async_wrapper.py[54517]: Start module (54517)
Oct 01 13:37:06 compute-0 ansible-async_wrapper.py[54513]: Return async_wrapper task started.
Oct 01 13:37:07 compute-0 sudo[54511]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:07 compute-0 python3.9[54518]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Oct 01 13:37:07 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct 01 13:37:07 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct 01 13:37:07 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct 01 13:37:07 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct 01 13:37:07 compute-0 kernel: cfg80211: failed to load regulatory.db
Oct 01 13:37:08 compute-0 NetworkManager[51741]: <info>  [1759325828.9748] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54519 uid=0 result="success"
Oct 01 13:37:08 compute-0 NetworkManager[51741]: <info>  [1759325828.9763] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0211] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0212] audit: op="connection-add" uuid="e03ebe12-9f90-4daa-a806-d9a1dbab4a6a" name="br-ex-br" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0225] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0226] audit: op="connection-add" uuid="9bfdbf69-357e-4362-b67b-e20c62f704e2" name="br-ex-port" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0236] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0237] audit: op="connection-add" uuid="4aa8e2b0-4700-4674-b5f6-2a36f6ca44e6" name="eth1-port" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0246] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0247] audit: op="connection-add" uuid="182c2025-a5e5-4e10-82a8-90a7b80531e0" name="vlan20-port" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0257] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0258] audit: op="connection-add" uuid="154e916e-62fd-40bd-9d14-e5da10b0b7bc" name="vlan21-port" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0267] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0268] audit: op="connection-add" uuid="43cb9610-df39-4184-b526-a73fd9ee202a" name="vlan22-port" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0284] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0298] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0299] audit: op="connection-add" uuid="0141981b-1d9e-4a10-9016-e51349860d2c" name="br-ex-if" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0404] audit: op="connection-update" uuid="601254e3-3abf-5e1b-b4d3-7e1a095eff98" name="ci-private-network" args="ovs-interface.type,ipv4.routing-rules,ipv4.routes,ipv4.addresses,ipv4.dns,ipv4.never-default,ipv4.method,ipv6.routing-rules,ipv6.routes,ipv6.method,ipv6.addr-gen-mode,ipv6.addresses,ipv6.dns,ovs-external-ids.data,connection.port-type,connection.master,connection.timestamp,connection.slave-type,connection.controller" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0418] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0419] audit: op="connection-add" uuid="c7e447c7-cf09-431a-87d2-6ccb2cd43e2c" name="vlan20-if" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0434] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0435] audit: op="connection-add" uuid="cfe2b24c-748e-4b9e-9541-9d12baac7943" name="vlan21-if" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0448] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0449] audit: op="connection-add" uuid="3229a711-1894-42ff-b3bd-c8fa87d0cf1c" name="vlan22-if" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0459] audit: op="connection-delete" uuid="899ff706-8c33-3033-a1ad-6ae636ae4542" name="Wired connection 1" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0469] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0477] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0480] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (e03ebe12-9f90-4daa-a806-d9a1dbab4a6a)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0480] audit: op="connection-activate" uuid="e03ebe12-9f90-4daa-a806-d9a1dbab4a6a" name="br-ex-br" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0482] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0487] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0490] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (9bfdbf69-357e-4362-b67b-e20c62f704e2)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0492] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0496] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0499] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (4aa8e2b0-4700-4674-b5f6-2a36f6ca44e6)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0500] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0505] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0508] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (182c2025-a5e5-4e10-82a8-90a7b80531e0)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0509] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0514] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0517] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (154e916e-62fd-40bd-9d14-e5da10b0b7bc)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0519] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0523] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0526] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (43cb9610-df39-4184-b526-a73fd9ee202a)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0527] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0529] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0530] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0536] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0539] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0541] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (0141981b-1d9e-4a10-9016-e51349860d2c)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0541] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0544] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0545] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0546] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0546] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0553] device (eth1): disconnecting for new activation request.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0554] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0557] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0558] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0559] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0561] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0565] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0569] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (c7e447c7-cf09-431a-87d2-6ccb2cd43e2c)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0569] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0571] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0573] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0574] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0576] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0580] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0584] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (cfe2b24c-748e-4b9e-9541-9d12baac7943)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0585] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0587] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0589] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0590] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0592] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0596] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0600] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (3229a711-1894-42ff-b3bd-c8fa87d0cf1c)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0600] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0603] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0605] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0606] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0607] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0617] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0618] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0620] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0622] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0627] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0629] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0632] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0634] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0636] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0641] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 kernel: ovs-system: entered promiscuous mode
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0644] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0647] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0648] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0652] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 systemd-udevd[54526]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0656] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0659] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0660] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0665] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0669] dhcp4 (eth0): canceled DHCP transaction
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0669] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0669] dhcp4 (eth0): state changed no lease
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0671] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct 01 13:37:09 compute-0 kernel: Timeout policy base is empty
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0686] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0689] audit: op="device-reapply" interface="eth1" ifindex=3 pid=54519 uid=0 result="fail" reason="Device is not activated"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0694] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct 01 13:37:09 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0745] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0750] dhcp4 (eth0): state changed new lease, address=38.102.83.163
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0760] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct 01 13:37:09 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0828] device (eth1): disconnecting for new activation request.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0829] audit: op="connection-activate" uuid="601254e3-3abf-5e1b-b4d3-7e1a095eff98" name="ci-private-network" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0837] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0932] device (eth1): Activation: starting connection 'ci-private-network' (601254e3-3abf-5e1b-b4d3-7e1a095eff98)
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0937] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0953] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0958] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0963] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0968] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0974] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0975] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0976] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0977] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0979] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54519 uid=0 result="success"
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0979] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0983] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0993] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.0998] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1002] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1006] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1010] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1014] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1018] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1023] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1027] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1031] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1038] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1041] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 kernel: br-ex: entered promiscuous mode
Oct 01 13:37:09 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct 01 13:37:09 compute-0 kernel: vlan22: entered promiscuous mode
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1090] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1092] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1097] device (eth1): Activation: successful, device activated.
Oct 01 13:37:09 compute-0 systemd-udevd[54525]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1208] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1222] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 kernel: vlan21: entered promiscuous mode
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1254] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1268] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1274] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1281] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1287] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1295] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1297] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 kernel: vlan20: entered promiscuous mode
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1303] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1497] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1500] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1524] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1529] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1554] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1555] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1562] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1571] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1573] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct 01 13:37:09 compute-0 NetworkManager[51741]: <info>  [1759325829.1579] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct 01 13:37:10 compute-0 NetworkManager[51741]: <info>  [1759325830.3040] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54519 uid=0 result="success"
Oct 01 13:37:10 compute-0 NetworkManager[51741]: <info>  [1759325830.4648] checkpoint[0x55943a3b8950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Oct 01 13:37:10 compute-0 NetworkManager[51741]: <info>  [1759325830.4649] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54519 uid=0 result="success"
Oct 01 13:37:10 compute-0 sudo[54852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnioujaaqmmxsfbocfvwgszxdsmggjsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325830.1291254-570-219605213515051/AnsiballZ_async_status.py'
Oct 01 13:37:10 compute-0 sudo[54852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:10 compute-0 NetworkManager[51741]: <info>  [1759325830.7175] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54519 uid=0 result="success"
Oct 01 13:37:10 compute-0 NetworkManager[51741]: <info>  [1759325830.7187] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54519 uid=0 result="success"
Oct 01 13:37:10 compute-0 python3.9[54854]: ansible-ansible.legacy.async_status Invoked with jid=j629828002994.54513 mode=status _async_dir=/root/.ansible_async
Oct 01 13:37:10 compute-0 sudo[54852]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:10 compute-0 NetworkManager[51741]: <info>  [1759325830.9163] audit: op="networking-control" arg="global-dns-configuration" pid=54519 uid=0 result="success"
Oct 01 13:37:10 compute-0 NetworkManager[51741]: <info>  [1759325830.9228] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Oct 01 13:37:10 compute-0 NetworkManager[51741]: <info>  [1759325830.9290] audit: op="networking-control" arg="global-dns-configuration" pid=54519 uid=0 result="success"
Oct 01 13:37:10 compute-0 NetworkManager[51741]: <info>  [1759325830.9316] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54519 uid=0 result="success"
Oct 01 13:37:11 compute-0 NetworkManager[51741]: <info>  [1759325831.0690] checkpoint[0x55943a3b8a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Oct 01 13:37:11 compute-0 NetworkManager[51741]: <info>  [1759325831.0694] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54519 uid=0 result="success"
Oct 01 13:37:11 compute-0 ansible-async_wrapper.py[54517]: Module complete (54517)
Oct 01 13:37:11 compute-0 ansible-async_wrapper.py[54516]: Done in kid B.
Oct 01 13:37:14 compute-0 sudo[54956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swhmcipjzipgfjmfsuytvdnglmxrdquy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325830.1291254-570-219605213515051/AnsiballZ_async_status.py'
Oct 01 13:37:14 compute-0 sudo[54956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:14 compute-0 python3.9[54958]: ansible-ansible.legacy.async_status Invoked with jid=j629828002994.54513 mode=status _async_dir=/root/.ansible_async
Oct 01 13:37:14 compute-0 sudo[54956]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:14 compute-0 sudo[55056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-redpwlzzpiahxhenxmjirojlzhwngpmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325830.1291254-570-219605213515051/AnsiballZ_async_status.py'
Oct 01 13:37:14 compute-0 sudo[55056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:14 compute-0 python3.9[55058]: ansible-ansible.legacy.async_status Invoked with jid=j629828002994.54513 mode=cleanup _async_dir=/root/.ansible_async
Oct 01 13:37:14 compute-0 sudo[55056]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:15 compute-0 sudo[55208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utmjsmgzrbclreyfhvqefuroeyljirhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325835.2183805-624-91384122573232/AnsiballZ_stat.py'
Oct 01 13:37:15 compute-0 sudo[55208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:15 compute-0 python3.9[55210]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:37:15 compute-0 sudo[55208]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:16 compute-0 sudo[55331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoclqfvurfsthafkxshfmbmjlsnhhhpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325835.2183805-624-91384122573232/AnsiballZ_copy.py'
Oct 01 13:37:16 compute-0 sudo[55331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:16 compute-0 python3.9[55333]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759325835.2183805-624-91384122573232/.source.returncode _original_basename=.cu_u7btu follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:37:16 compute-0 sudo[55331]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:16 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 01 13:37:17 compute-0 sudo[55485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgpssjenwcavknhzieoddtrueqwiffzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325836.6853988-656-231146388505266/AnsiballZ_stat.py'
Oct 01 13:37:17 compute-0 sudo[55485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:17 compute-0 python3.9[55487]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:37:17 compute-0 sudo[55485]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:17 compute-0 sudo[55609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwxkvgixzkqhirfyvfnoozfoepkmpzxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325836.6853988-656-231146388505266/AnsiballZ_copy.py'
Oct 01 13:37:17 compute-0 sudo[55609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:17 compute-0 python3.9[55611]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759325836.6853988-656-231146388505266/.source.cfg _original_basename=.zz_x1pxu follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:37:17 compute-0 sudo[55609]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:18 compute-0 sudo[55761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcgmmnreramflfijfznvdhvgyppcpmda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325838.2636-686-245634905336118/AnsiballZ_systemd.py'
Oct 01 13:37:18 compute-0 sudo[55761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:19 compute-0 python3.9[55763]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:37:19 compute-0 systemd[1]: Reloading Network Manager...
Oct 01 13:37:19 compute-0 NetworkManager[51741]: <info>  [1759325839.1062] audit: op="reload" arg="0" pid=55767 uid=0 result="success"
Oct 01 13:37:19 compute-0 NetworkManager[51741]: <info>  [1759325839.1073] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Oct 01 13:37:19 compute-0 systemd[1]: Reloaded Network Manager.
Oct 01 13:37:19 compute-0 sudo[55761]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:19 compute-0 sshd-session[47744]: Connection closed by 192.168.122.30 port 40418
Oct 01 13:37:19 compute-0 sshd-session[47741]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:37:19 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Oct 01 13:37:19 compute-0 systemd[1]: session-12.scope: Consumed 52.096s CPU time.
Oct 01 13:37:19 compute-0 systemd-logind[791]: Session 12 logged out. Waiting for processes to exit.
Oct 01 13:37:19 compute-0 systemd-logind[791]: Removed session 12.
Oct 01 13:37:24 compute-0 sshd-session[55798]: Accepted publickey for zuul from 192.168.122.30 port 34038 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:37:24 compute-0 systemd-logind[791]: New session 13 of user zuul.
Oct 01 13:37:24 compute-0 systemd[1]: Started Session 13 of User zuul.
Oct 01 13:37:24 compute-0 sshd-session[55798]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:37:26 compute-0 python3.9[55951]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:37:27 compute-0 python3.9[56105]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:37:28 compute-0 python3.9[56295]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:37:28 compute-0 sshd-session[55801]: Connection closed by 192.168.122.30 port 34038
Oct 01 13:37:28 compute-0 sshd-session[55798]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:37:28 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Oct 01 13:37:28 compute-0 systemd[1]: session-13.scope: Consumed 2.722s CPU time.
Oct 01 13:37:28 compute-0 systemd-logind[791]: Session 13 logged out. Waiting for processes to exit.
Oct 01 13:37:28 compute-0 systemd-logind[791]: Removed session 13.
Oct 01 13:37:29 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 01 13:37:35 compute-0 sshd-session[56325]: Accepted publickey for zuul from 192.168.122.30 port 38220 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:37:35 compute-0 systemd-logind[791]: New session 14 of user zuul.
Oct 01 13:37:35 compute-0 systemd[1]: Started Session 14 of User zuul.
Oct 01 13:37:35 compute-0 sshd-session[56325]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:37:36 compute-0 python3.9[56478]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:37:37 compute-0 python3.9[56633]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:37:38 compute-0 sudo[56787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdnngcpvswdoxkhztaxkxritcdkbiskr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325858.21133-60-199695401820459/AnsiballZ_setup.py'
Oct 01 13:37:38 compute-0 sudo[56787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:38 compute-0 python3.9[56789]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:37:39 compute-0 sudo[56787]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:39 compute-0 sudo[56871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wztrbxqsoamlflueyopjfponrsplvfab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325858.21133-60-199695401820459/AnsiballZ_dnf.py'
Oct 01 13:37:39 compute-0 sudo[56871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:40 compute-0 python3.9[56873]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:37:41 compute-0 sudo[56871]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:41 compute-0 sudo[57025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhatendebwjcefiwwscvqwbhvfpvubfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325861.4697165-84-104756176214064/AnsiballZ_setup.py'
Oct 01 13:37:41 compute-0 sudo[57025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:42 compute-0 python3.9[57027]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:37:42 compute-0 sudo[57025]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:43 compute-0 sudo[57216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aomwtvnvqxykuljshuczpwqajlvagodn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325862.93518-106-199618901693916/AnsiballZ_file.py'
Oct 01 13:37:43 compute-0 sudo[57216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:43 compute-0 python3.9[57218]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:37:43 compute-0 sudo[57216]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:44 compute-0 sudo[57368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fizrefekjhtgoiivkwwlnonrzryxhiqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325863.9898279-122-90106980614655/AnsiballZ_command.py'
Oct 01 13:37:44 compute-0 sudo[57368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:44 compute-0 python3.9[57370]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:37:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:37:44 compute-0 sudo[57368]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:45 compute-0 sudo[57531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tznaesadrabirppveqrlrdzxfydgpsly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325865.1453521-138-278032935550731/AnsiballZ_stat.py'
Oct 01 13:37:45 compute-0 sudo[57531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:45 compute-0 python3.9[57533]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:37:45 compute-0 sudo[57531]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:46 compute-0 sudo[57609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlkrhpahopjojquitkoalwbjbqgtblal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325865.1453521-138-278032935550731/AnsiballZ_file.py'
Oct 01 13:37:46 compute-0 sudo[57609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:46 compute-0 python3.9[57611]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:37:46 compute-0 sudo[57609]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:46 compute-0 sudo[57761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yghbbrgbaurjrugrixnuybowwalwphwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325866.645337-162-131251992975057/AnsiballZ_stat.py'
Oct 01 13:37:46 compute-0 sudo[57761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:47 compute-0 python3.9[57763]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:37:47 compute-0 sudo[57761]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:47 compute-0 sudo[57839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqflsxtnmfiuavejocmvukjwzlnlfyub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325866.645337-162-131251992975057/AnsiballZ_file.py'
Oct 01 13:37:47 compute-0 sudo[57839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:47 compute-0 python3.9[57841]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:37:47 compute-0 sudo[57839]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:48 compute-0 sudo[57991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwrtijomtksxdbormviruyhmmlgcmgwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325868.1481483-188-201665071123564/AnsiballZ_ini_file.py'
Oct 01 13:37:48 compute-0 sudo[57991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:48 compute-0 python3.9[57993]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:37:48 compute-0 sudo[57991]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:49 compute-0 sudo[58143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fogwcrvrpcurdjsckerxzxfomzetmlbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325869.0821388-188-254137806023336/AnsiballZ_ini_file.py'
Oct 01 13:37:49 compute-0 sudo[58143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:49 compute-0 python3.9[58145]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:37:49 compute-0 sudo[58143]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:50 compute-0 sudo[58295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muhjjgipwrqhkzrkmxbqpdkdbkknkalx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325869.7748697-188-171680280066972/AnsiballZ_ini_file.py'
Oct 01 13:37:50 compute-0 sudo[58295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:50 compute-0 python3.9[58297]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:37:50 compute-0 sudo[58295]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:50 compute-0 sudo[58447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umgvyljkkwioqcpsdqjxdgzvpdligilj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325870.5037384-188-67845297399356/AnsiballZ_ini_file.py'
Oct 01 13:37:50 compute-0 sudo[58447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:51 compute-0 python3.9[58449]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:37:51 compute-0 sudo[58447]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:51 compute-0 sudo[58599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkfyujttubbcdpxxuivpwtqqunaumeup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325871.329195-250-132170743146979/AnsiballZ_dnf.py'
Oct 01 13:37:51 compute-0 sudo[58599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:51 compute-0 python3.9[58601]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:37:53 compute-0 sudo[58599]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:53 compute-0 sudo[58752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adyttudwyshkjekdfuzmnjvsjrmqkzra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325873.4862714-272-171561913162114/AnsiballZ_setup.py'
Oct 01 13:37:53 compute-0 sudo[58752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:54 compute-0 python3.9[58754]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:37:54 compute-0 sudo[58752]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:54 compute-0 sudo[58906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntnjwdbyexihsgehtjkgqrufejdmaofs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325874.3366156-288-35475481239459/AnsiballZ_stat.py'
Oct 01 13:37:54 compute-0 sudo[58906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:54 compute-0 python3.9[58908]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:37:54 compute-0 sudo[58906]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:55 compute-0 sudo[59058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibwudvjcmnhcdeedxhtkcygxdedckbbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325875.4004142-306-107547919428762/AnsiballZ_stat.py'
Oct 01 13:37:55 compute-0 sudo[59058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:55 compute-0 python3.9[59060]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:37:55 compute-0 sudo[59058]: pam_unix(sudo:session): session closed for user root
Oct 01 13:37:56 compute-0 sudo[59210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruayrpftezbnuqocwzvdhodxllnubrls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325876.2398486-326-38581281648938/AnsiballZ_service_facts.py'
Oct 01 13:37:56 compute-0 sudo[59210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:37:56 compute-0 python3.9[59212]: ansible-service_facts Invoked
Oct 01 13:37:57 compute-0 network[59229]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 01 13:37:57 compute-0 network[59230]: 'network-scripts' will be removed from distribution in near future.
Oct 01 13:37:57 compute-0 network[59231]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 01 13:38:00 compute-0 sudo[59210]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:03 compute-0 sudo[59516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyisfsshxpoqokmdvadegtnpmdykdrqf ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1759325883.1039555-352-223910222818298/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1759325883.1039555-352-223910222818298/args'
Oct 01 13:38:03 compute-0 sudo[59516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:03 compute-0 sudo[59516]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:04 compute-0 sudo[59683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dylygeuaurofkktwdsryizdmrwdtarhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325884.0805967-374-58361515929723/AnsiballZ_dnf.py'
Oct 01 13:38:04 compute-0 sudo[59683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:04 compute-0 python3.9[59685]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:38:05 compute-0 sudo[59683]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:07 compute-0 sudo[59836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fehwophtrfthelrmmtxzxwqyhlgagera ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325886.351015-400-136165717920504/AnsiballZ_package_facts.py'
Oct 01 13:38:07 compute-0 sudo[59836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:07 compute-0 python3.9[59838]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct 01 13:38:07 compute-0 sudo[59836]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:08 compute-0 sudo[59988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfvtoyaquvvurtoruloxqxbstorgbgvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325888.2477133-420-55007866643068/AnsiballZ_stat.py'
Oct 01 13:38:08 compute-0 sudo[59988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:08 compute-0 python3.9[59990]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:38:08 compute-0 sudo[59988]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:09 compute-0 sudo[60113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmucrbuhdcmqfxglpcchwfksillgwpci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325888.2477133-420-55007866643068/AnsiballZ_copy.py'
Oct 01 13:38:09 compute-0 sudo[60113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:09 compute-0 python3.9[60115]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759325888.2477133-420-55007866643068/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:38:09 compute-0 sudo[60113]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:10 compute-0 sudo[60267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulngbkrrfgurbccwqdltvanozjeuyakw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325890.0070467-450-175779167643094/AnsiballZ_stat.py'
Oct 01 13:38:10 compute-0 sudo[60267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:10 compute-0 python3.9[60269]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:38:10 compute-0 sudo[60267]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:11 compute-0 sudo[60392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjdfxllawvouansxrqdchwilgaxupgih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325890.0070467-450-175779167643094/AnsiballZ_copy.py'
Oct 01 13:38:11 compute-0 sudo[60392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:11 compute-0 python3.9[60394]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759325890.0070467-450-175779167643094/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:38:11 compute-0 sudo[60392]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:12 compute-0 sudo[60546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twmenmemwdiiwdepwlhsxujebnogsbiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325892.0551014-492-60614861386506/AnsiballZ_lineinfile.py'
Oct 01 13:38:12 compute-0 sudo[60546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:12 compute-0 python3.9[60548]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:38:12 compute-0 sudo[60546]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:14 compute-0 sudo[60700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmxwhkqbxivyyfijnasqopeoahlhtvaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325893.6415389-522-124645715333075/AnsiballZ_setup.py'
Oct 01 13:38:14 compute-0 sudo[60700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:14 compute-0 python3.9[60702]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:38:14 compute-0 sudo[60700]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:15 compute-0 sudo[60784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brfczufymifaierzmgpmebjsyxkxhdit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325893.6415389-522-124645715333075/AnsiballZ_systemd.py'
Oct 01 13:38:15 compute-0 sudo[60784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:15 compute-0 python3.9[60786]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:38:15 compute-0 sudo[60784]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:16 compute-0 sudo[60938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzrthdeuddsqwletoymujcxdxwkzglsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325896.167362-554-15046169200124/AnsiballZ_setup.py'
Oct 01 13:38:16 compute-0 sudo[60938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:16 compute-0 python3.9[60940]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:38:16 compute-0 sudo[60938]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:17 compute-0 sudo[61022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kguydpwojckbdcftwyuvovobrwetijfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325896.167362-554-15046169200124/AnsiballZ_systemd.py'
Oct 01 13:38:17 compute-0 sudo[61022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:17 compute-0 python3.9[61024]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:38:17 compute-0 chronyd[793]: chronyd exiting
Oct 01 13:38:17 compute-0 systemd[1]: Stopping NTP client/server...
Oct 01 13:38:17 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Oct 01 13:38:17 compute-0 systemd[1]: Stopped NTP client/server.
Oct 01 13:38:17 compute-0 systemd[1]: Starting NTP client/server...
Oct 01 13:38:17 compute-0 chronyd[61033]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct 01 13:38:17 compute-0 chronyd[61033]: Frequency -27.004 +/- 0.479 ppm read from /var/lib/chrony/drift
Oct 01 13:38:17 compute-0 chronyd[61033]: Loaded seccomp filter (level 2)
Oct 01 13:38:17 compute-0 systemd[1]: Started NTP client/server.
Oct 01 13:38:17 compute-0 sudo[61022]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:18 compute-0 sshd-session[56328]: Connection closed by 192.168.122.30 port 38220
Oct 01 13:38:18 compute-0 sshd-session[56325]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:38:18 compute-0 systemd-logind[791]: Session 14 logged out. Waiting for processes to exit.
Oct 01 13:38:18 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Oct 01 13:38:18 compute-0 systemd[1]: session-14.scope: Consumed 28.384s CPU time.
Oct 01 13:38:18 compute-0 systemd-logind[791]: Removed session 14.
Oct 01 13:38:25 compute-0 sshd-session[61059]: Accepted publickey for zuul from 192.168.122.30 port 59374 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:38:25 compute-0 systemd-logind[791]: New session 15 of user zuul.
Oct 01 13:38:25 compute-0 systemd[1]: Started Session 15 of User zuul.
Oct 01 13:38:25 compute-0 sshd-session[61059]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:38:26 compute-0 python3.9[61212]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:38:27 compute-0 sudo[61366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsshdiosoihwcmkofmczepvykxhufttr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325907.2453358-46-43422190339943/AnsiballZ_file.py'
Oct 01 13:38:27 compute-0 sudo[61366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:27 compute-0 python3.9[61368]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:38:28 compute-0 sudo[61366]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:28 compute-0 sudo[61541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-merxnwalyzhoflvndhgninbxfwkrjfwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325908.218997-62-209056476285488/AnsiballZ_stat.py'
Oct 01 13:38:28 compute-0 sudo[61541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:29 compute-0 python3.9[61543]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:38:29 compute-0 sudo[61541]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:29 compute-0 sudo[61619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsagiskdrwutekapxljcdmmayufwwhgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325908.218997-62-209056476285488/AnsiballZ_file.py'
Oct 01 13:38:29 compute-0 sudo[61619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:29 compute-0 python3.9[61621]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.rwzhnxs2 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:38:29 compute-0 sudo[61619]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:30 compute-0 sudo[61771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chxomqcczpyikjbadzshlnbnpkfpdicf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325910.0391076-102-273069673927663/AnsiballZ_stat.py'
Oct 01 13:38:30 compute-0 sudo[61771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:30 compute-0 python3.9[61773]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:38:30 compute-0 sudo[61771]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:31 compute-0 sudo[61894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijyvjanhymxqdifanfgupsqhkupnlyko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325910.0391076-102-273069673927663/AnsiballZ_copy.py'
Oct 01 13:38:31 compute-0 sudo[61894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:31 compute-0 python3.9[61896]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759325910.0391076-102-273069673927663/.source _original_basename=.k46eoqfw follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:38:31 compute-0 sudo[61894]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:31 compute-0 sudo[62046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mehvkpnyuosuwimaomwvzoyjapeneayx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325911.6786478-134-109662771493729/AnsiballZ_file.py'
Oct 01 13:38:31 compute-0 sudo[62046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:32 compute-0 python3.9[62048]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:38:32 compute-0 sudo[62046]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:32 compute-0 sudo[62198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwpjywcwxnrfvhbnfvxkoejtdmgfknqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325912.3851902-150-77759677142050/AnsiballZ_stat.py'
Oct 01 13:38:32 compute-0 sudo[62198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:32 compute-0 python3.9[62200]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:38:32 compute-0 sudo[62198]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:33 compute-0 sudo[62321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hciptdxdxiicypeukrmzaiahdiykmadm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325912.3851902-150-77759677142050/AnsiballZ_copy.py'
Oct 01 13:38:33 compute-0 sudo[62321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:33 compute-0 python3.9[62323]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759325912.3851902-150-77759677142050/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:38:33 compute-0 sudo[62321]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:34 compute-0 sudo[62473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjciciplozeysvofpkvvilrchuuptmpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325913.748907-150-99792450127556/AnsiballZ_stat.py'
Oct 01 13:38:34 compute-0 sudo[62473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:34 compute-0 python3.9[62475]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:38:34 compute-0 sudo[62473]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:34 compute-0 sudo[62596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xztqevamqxyerlbzdfwzrzkrdqltzwxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325913.748907-150-99792450127556/AnsiballZ_copy.py'
Oct 01 13:38:34 compute-0 sudo[62596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:35 compute-0 python3.9[62598]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759325913.748907-150-99792450127556/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:38:35 compute-0 sudo[62596]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:35 compute-0 sudo[62748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vptlwdqvklsgovlpktcaakxjzirfnlup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325915.3201766-208-273580164463448/AnsiballZ_file.py'
Oct 01 13:38:35 compute-0 sudo[62748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:36 compute-0 python3.9[62750]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:38:36 compute-0 sudo[62748]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:36 compute-0 sudo[62900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeegwoktjomivnnmblqwqnupqsbqidxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325916.2242136-224-125313373391114/AnsiballZ_stat.py'
Oct 01 13:38:36 compute-0 sudo[62900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:36 compute-0 python3.9[62902]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:38:36 compute-0 sudo[62900]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:37 compute-0 sudo[63023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pagnbnfvinzjwvbozalhkxxcadtyzazh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325916.2242136-224-125313373391114/AnsiballZ_copy.py'
Oct 01 13:38:37 compute-0 sudo[63023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:37 compute-0 python3.9[63025]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759325916.2242136-224-125313373391114/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:38:37 compute-0 sudo[63023]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:38 compute-0 sudo[63175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzvzwdepwfococjogbcllcsqucistrmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325917.6807601-254-210973195709131/AnsiballZ_stat.py'
Oct 01 13:38:38 compute-0 sudo[63175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:38 compute-0 python3.9[63177]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:38:38 compute-0 sudo[63175]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:38 compute-0 sudo[63298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkcozurfkhtvkwstdnsjziydicvvqvox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325917.6807601-254-210973195709131/AnsiballZ_copy.py'
Oct 01 13:38:38 compute-0 sudo[63298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:38 compute-0 python3.9[63300]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759325917.6807601-254-210973195709131/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:38:38 compute-0 sudo[63298]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:39 compute-0 sudo[63450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znzqekvpuurkskvwqelxtqqtuymfvikg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325919.2056093-284-185684093397936/AnsiballZ_systemd.py'
Oct 01 13:38:39 compute-0 sudo[63450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:40 compute-0 python3.9[63452]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:38:40 compute-0 systemd[1]: Reloading.
Oct 01 13:38:40 compute-0 systemd-rc-local-generator[63475]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:38:40 compute-0 systemd-sysv-generator[63482]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:38:40 compute-0 systemd[1]: Reloading.
Oct 01 13:38:40 compute-0 systemd-sysv-generator[63520]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:38:40 compute-0 systemd-rc-local-generator[63516]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:38:40 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Oct 01 13:38:40 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Oct 01 13:38:40 compute-0 sudo[63450]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:41 compute-0 sudo[63678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebmiaegzcfmghhrqgpbkqaueqwdkvsfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325920.8727486-300-221249143838555/AnsiballZ_stat.py'
Oct 01 13:38:41 compute-0 sudo[63678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:41 compute-0 python3.9[63680]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:38:41 compute-0 sudo[63678]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:41 compute-0 sudo[63801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qscsvvkdziwaubjusqsglwhqvnazswbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325920.8727486-300-221249143838555/AnsiballZ_copy.py'
Oct 01 13:38:41 compute-0 sudo[63801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:41 compute-0 python3.9[63803]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759325920.8727486-300-221249143838555/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:38:41 compute-0 sudo[63801]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:42 compute-0 sudo[63953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbnejrpkqmquqsxtxdezdpmjwdsqforj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325922.200029-330-99236271831277/AnsiballZ_stat.py'
Oct 01 13:38:42 compute-0 sudo[63953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:42 compute-0 python3.9[63955]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:38:42 compute-0 sudo[63953]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:43 compute-0 sudo[64076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eguqzaseofwgyvjcwujrxafwuqwbdcci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325922.200029-330-99236271831277/AnsiballZ_copy.py'
Oct 01 13:38:43 compute-0 sudo[64076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:43 compute-0 python3.9[64078]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759325922.200029-330-99236271831277/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:38:43 compute-0 sudo[64076]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:43 compute-0 sudo[64228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aesmunlnphlkaczaqcopepnziorypubz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325923.5750532-360-48479611443016/AnsiballZ_systemd.py'
Oct 01 13:38:43 compute-0 sudo[64228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:44 compute-0 python3.9[64230]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:38:44 compute-0 systemd[1]: Reloading.
Oct 01 13:38:44 compute-0 systemd-sysv-generator[64257]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:38:44 compute-0 systemd-rc-local-generator[64254]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:38:44 compute-0 systemd[1]: Reloading.
Oct 01 13:38:44 compute-0 systemd-sysv-generator[64299]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:38:44 compute-0 systemd-rc-local-generator[64295]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:38:44 compute-0 systemd[1]: Starting Create netns directory...
Oct 01 13:38:44 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 01 13:38:44 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 01 13:38:44 compute-0 systemd[1]: Finished Create netns directory.
Oct 01 13:38:44 compute-0 sudo[64228]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:45 compute-0 python3.9[64455]: ansible-ansible.builtin.service_facts Invoked
Oct 01 13:38:45 compute-0 network[64472]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 01 13:38:45 compute-0 network[64473]: 'network-scripts' will be removed from distribution in near future.
Oct 01 13:38:45 compute-0 network[64474]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 01 13:38:50 compute-0 sudo[64736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkyavenoipglbfaqedxdyynalfmzolrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325930.441778-392-213619018002226/AnsiballZ_systemd.py'
Oct 01 13:38:50 compute-0 sudo[64736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:51 compute-0 python3.9[64738]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:38:51 compute-0 systemd[1]: Reloading.
Oct 01 13:38:51 compute-0 systemd-sysv-generator[64774]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:38:51 compute-0 systemd-rc-local-generator[64769]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:38:51 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Oct 01 13:38:51 compute-0 iptables.init[64779]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Oct 01 13:38:51 compute-0 iptables.init[64779]: iptables: Flushing firewall rules: [  OK  ]
Oct 01 13:38:51 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Oct 01 13:38:51 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Oct 01 13:38:51 compute-0 sudo[64736]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:52 compute-0 sudo[64973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itpbyppylxtzcaehovyusgwyeayyswmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325931.945835-392-14282817167235/AnsiballZ_systemd.py'
Oct 01 13:38:52 compute-0 sudo[64973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:52 compute-0 python3.9[64975]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:38:52 compute-0 sudo[64973]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:53 compute-0 sudo[65127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iutyzkvgcmkprvlzcoppzcpxdispjzcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325933.297548-424-67571318549701/AnsiballZ_systemd.py'
Oct 01 13:38:53 compute-0 sudo[65127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:53 compute-0 python3.9[65129]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:38:53 compute-0 systemd[1]: Reloading.
Oct 01 13:38:54 compute-0 systemd-rc-local-generator[65158]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:38:54 compute-0 systemd-sysv-generator[65162]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:38:54 compute-0 systemd[1]: Starting Netfilter Tables...
Oct 01 13:38:54 compute-0 systemd[1]: Finished Netfilter Tables.
Oct 01 13:38:54 compute-0 sudo[65127]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:56 compute-0 sudo[65319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgewxxonigzsinuokhyelzquxoppjkxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325935.6585453-440-45504684721619/AnsiballZ_command.py'
Oct 01 13:38:56 compute-0 sudo[65319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:56 compute-0 python3.9[65321]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:38:56 compute-0 sudo[65319]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:57 compute-0 sudo[65472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiqzaunjmfeknycofnaxuhnkcljitxij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325936.9021204-468-113062363366837/AnsiballZ_stat.py'
Oct 01 13:38:57 compute-0 sudo[65472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:57 compute-0 python3.9[65474]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:38:57 compute-0 sudo[65472]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:57 compute-0 sudo[65597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ergexidkuxvvhjrfxojvemyjetjwqzbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325936.9021204-468-113062363366837/AnsiballZ_copy.py'
Oct 01 13:38:57 compute-0 sudo[65597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:38:57 compute-0 python3.9[65599]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759325936.9021204-468-113062363366837/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:38:58 compute-0 sudo[65597]: pam_unix(sudo:session): session closed for user root
Oct 01 13:38:58 compute-0 python3.9[65750]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:38:58 compute-0 polkitd[6977]: Registered Authentication Agent for unix-process:65752:229948 (system bus name :1.555 [/usr/bin/pkttyagent --notify-fd 5 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Oct 01 13:39:23 compute-0 polkit-agent-helper-1[65764]: pam_unix(polkit-1:auth): conversation failed
Oct 01 13:39:23 compute-0 polkit-agent-helper-1[65764]: pam_unix(polkit-1:auth): auth could not identify password for [root]
Oct 01 13:39:23 compute-0 polkitd[6977]: Unregistered Authentication Agent for unix-process:65752:229948 (system bus name :1.555, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Oct 01 13:39:23 compute-0 polkitd[6977]: Operator of unix-process:65752:229948 FAILED to authenticate to gain authorization for action org.freedesktop.systemd1.manage-units for system-bus-name::1.554 [<unknown>] (owned by unix-user:zuul)
Oct 01 13:39:24 compute-0 sshd-session[61062]: Connection closed by 192.168.122.30 port 59374
Oct 01 13:39:24 compute-0 sshd-session[61059]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:39:24 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Oct 01 13:39:24 compute-0 systemd[1]: session-15.scope: Consumed 21.853s CPU time.
Oct 01 13:39:24 compute-0 systemd-logind[791]: Session 15 logged out. Waiting for processes to exit.
Oct 01 13:39:24 compute-0 systemd-logind[791]: Removed session 15.
Oct 01 13:39:37 compute-0 sshd-session[65790]: Accepted publickey for zuul from 192.168.122.30 port 55558 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:39:37 compute-0 systemd-logind[791]: New session 16 of user zuul.
Oct 01 13:39:37 compute-0 systemd[1]: Started Session 16 of User zuul.
Oct 01 13:39:37 compute-0 sshd-session[65790]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:39:38 compute-0 python3.9[65943]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:39:39 compute-0 sudo[66097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdlvwooakwvsvbtxiibeyzsrevenezzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325978.7096567-46-5613098106416/AnsiballZ_file.py'
Oct 01 13:39:39 compute-0 sudo[66097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:39 compute-0 python3.9[66099]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:39:39 compute-0 sudo[66097]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:40 compute-0 sudo[66272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbnabpcabxueignuhvmmojuqtugwylfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325979.7082691-62-92313585160628/AnsiballZ_stat.py'
Oct 01 13:39:40 compute-0 sudo[66272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:40 compute-0 python3.9[66274]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:39:40 compute-0 sudo[66272]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:40 compute-0 sudo[66350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agltmdbfqlnnulqlyetjdmjbhfogbgny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325979.7082691-62-92313585160628/AnsiballZ_file.py'
Oct 01 13:39:40 compute-0 sudo[66350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:41 compute-0 python3.9[66352]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.433c2uqa recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:39:41 compute-0 sudo[66350]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:41 compute-0 sudo[66502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozexmvectesyjmrcoupxxpkjunegjojn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325981.4675786-102-124164699261531/AnsiballZ_stat.py'
Oct 01 13:39:41 compute-0 sudo[66502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:42 compute-0 python3.9[66504]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:39:42 compute-0 sudo[66502]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:42 compute-0 sudo[66580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qezkuuggpdryqwlppljcollfehnyserv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325981.4675786-102-124164699261531/AnsiballZ_file.py'
Oct 01 13:39:42 compute-0 sudo[66580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:42 compute-0 python3.9[66582]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.ecpa_izm recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:39:42 compute-0 sudo[66580]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:43 compute-0 sudo[66732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fimvusvvdcpkjcfkfeigfgbdgfdbikwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325982.7964163-128-150843755107559/AnsiballZ_file.py'
Oct 01 13:39:43 compute-0 sudo[66732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:43 compute-0 python3.9[66734]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:39:43 compute-0 sudo[66732]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:43 compute-0 sudo[66884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnmteqcqgegzgsqrtzqxtwwefweoqfhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325983.5860171-144-142709381210859/AnsiballZ_stat.py'
Oct 01 13:39:43 compute-0 sudo[66884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:44 compute-0 python3.9[66886]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:39:44 compute-0 sudo[66884]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:44 compute-0 sudo[66962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wizggfgavkzvwziairofcwpkvijgfmgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325983.5860171-144-142709381210859/AnsiballZ_file.py'
Oct 01 13:39:44 compute-0 sudo[66962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:44 compute-0 python3.9[66964]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:39:44 compute-0 sudo[66962]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:45 compute-0 sudo[67114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzzsozynxodloafworhmdvxnwskijdvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325984.7583833-144-206404505693532/AnsiballZ_stat.py'
Oct 01 13:39:45 compute-0 sudo[67114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:45 compute-0 python3.9[67116]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:39:45 compute-0 sudo[67114]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:45 compute-0 sudo[67192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnxhecwrfwotofbhrtpoeokjggguwufz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325984.7583833-144-206404505693532/AnsiballZ_file.py'
Oct 01 13:39:45 compute-0 sudo[67192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:45 compute-0 python3.9[67194]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:39:45 compute-0 sudo[67192]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:47 compute-0 sudo[67346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiarneywtbnsmuqbcjkmnwfidjtpjwsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325987.0068102-190-98275256459834/AnsiballZ_file.py'
Oct 01 13:39:47 compute-0 sudo[67346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:47 compute-0 python3.9[67348]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:39:47 compute-0 sudo[67346]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:47 compute-0 sshd-session[67219]: Received disconnect from 80.94.93.176 port 18868:11:  [preauth]
Oct 01 13:39:47 compute-0 sshd-session[67219]: Disconnected from authenticating user root 80.94.93.176 port 18868 [preauth]
Oct 01 13:39:48 compute-0 sudo[67498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irhvzpekhybzjkpyclckzyrmfzqrijqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325987.8243623-206-17797554108707/AnsiballZ_stat.py'
Oct 01 13:39:48 compute-0 sudo[67498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:48 compute-0 python3.9[67500]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:39:48 compute-0 sudo[67498]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:48 compute-0 sudo[67576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hralwziyqsqlnojdoxifdsttevroztej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325987.8243623-206-17797554108707/AnsiballZ_file.py'
Oct 01 13:39:48 compute-0 sudo[67576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:48 compute-0 python3.9[67578]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:39:48 compute-0 sudo[67576]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:49 compute-0 sudo[67728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smbdxknegnizhtqmegnraiemvvceiaiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325989.2106903-230-36145449547821/AnsiballZ_stat.py'
Oct 01 13:39:49 compute-0 sudo[67728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:49 compute-0 python3.9[67730]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:39:49 compute-0 sudo[67728]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:50 compute-0 sudo[67806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okiqwbfchjnqbkmkeaxmfnhvklkpjcqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325989.2106903-230-36145449547821/AnsiballZ_file.py'
Oct 01 13:39:50 compute-0 sudo[67806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:50 compute-0 python3.9[67808]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:39:50 compute-0 sudo[67806]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:51 compute-0 sudo[67958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwzoafqxvpctbqxlusnaotkhkyjfzlmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325990.870813-254-45666603582226/AnsiballZ_systemd.py'
Oct 01 13:39:51 compute-0 sudo[67958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:51 compute-0 python3.9[67960]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:39:51 compute-0 systemd[1]: Reloading.
Oct 01 13:39:52 compute-0 systemd-rc-local-generator[67983]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:39:52 compute-0 systemd-sysv-generator[67986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:39:52 compute-0 sudo[67958]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:52 compute-0 sudo[68147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srhzdeccfaxgehoazgbdmjckkrycrhic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325992.5218875-270-207103686211834/AnsiballZ_stat.py'
Oct 01 13:39:52 compute-0 sudo[68147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:53 compute-0 python3.9[68149]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:39:53 compute-0 sudo[68147]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:53 compute-0 sudo[68225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyadvmimhqxsriktjscxxhlxmfjjzlvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325992.5218875-270-207103686211834/AnsiballZ_file.py'
Oct 01 13:39:53 compute-0 sudo[68225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:53 compute-0 python3.9[68227]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:39:53 compute-0 sudo[68225]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:54 compute-0 sudo[68377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-disfzphrinathsosjknfeduwhoowtamq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325994.1623547-294-251267030732310/AnsiballZ_stat.py'
Oct 01 13:39:54 compute-0 sudo[68377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:54 compute-0 python3.9[68379]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:39:54 compute-0 sudo[68377]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:54 compute-0 sudo[68455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vetiojrorwrxopcztoekrezmgxgyxbbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325994.1623547-294-251267030732310/AnsiballZ_file.py'
Oct 01 13:39:54 compute-0 sudo[68455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:55 compute-0 python3.9[68457]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:39:55 compute-0 sudo[68455]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:56 compute-0 sudo[68607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-comucajolixxunqnidgojhikpxmrxbfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759325995.8058763-318-94712266677231/AnsiballZ_systemd.py'
Oct 01 13:39:56 compute-0 sudo[68607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:39:56 compute-0 python3.9[68609]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:39:56 compute-0 systemd[1]: Reloading.
Oct 01 13:39:56 compute-0 systemd-sysv-generator[68640]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:39:56 compute-0 systemd-rc-local-generator[68637]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:39:56 compute-0 systemd[1]: Starting Create netns directory...
Oct 01 13:39:56 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 01 13:39:56 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 01 13:39:56 compute-0 systemd[1]: Finished Create netns directory.
Oct 01 13:39:56 compute-0 sudo[68607]: pam_unix(sudo:session): session closed for user root
Oct 01 13:39:58 compute-0 python3.9[68800]: ansible-ansible.builtin.service_facts Invoked
Oct 01 13:39:58 compute-0 network[68817]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 01 13:39:58 compute-0 network[68818]: 'network-scripts' will be removed from distribution in near future.
Oct 01 13:39:58 compute-0 network[68819]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 01 13:40:05 compute-0 sudo[69080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnkzulafewftrsktraexgmwjghifpxom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326005.1388009-370-218713682122875/AnsiballZ_stat.py'
Oct 01 13:40:05 compute-0 sudo[69080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:05 compute-0 python3.9[69082]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:40:05 compute-0 sudo[69080]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:05 compute-0 sudo[69158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbwkquifghiylqosasowfflldxcyuawa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326005.1388009-370-218713682122875/AnsiballZ_file.py'
Oct 01 13:40:05 compute-0 sudo[69158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:06 compute-0 python3.9[69160]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:06 compute-0 sudo[69158]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:06 compute-0 sudo[69310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwwcpytphalbtlrsvxtlqsepyixhwvau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326006.502251-396-65679341418853/AnsiballZ_file.py'
Oct 01 13:40:06 compute-0 sudo[69310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:07 compute-0 python3.9[69312]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:07 compute-0 sudo[69310]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:07 compute-0 sudo[69462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvaohuvatpqejqhiuijjnfsivbqjjyeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326007.4047408-412-142258138152406/AnsiballZ_stat.py'
Oct 01 13:40:07 compute-0 sudo[69462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:08 compute-0 python3.9[69464]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:40:08 compute-0 sudo[69462]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:08 compute-0 sudo[69585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yviuxsshrocagfmjfcgjjvraayhvzsyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326007.4047408-412-142258138152406/AnsiballZ_copy.py'
Oct 01 13:40:08 compute-0 sudo[69585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:08 compute-0 python3.9[69587]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326007.4047408-412-142258138152406/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:08 compute-0 sudo[69585]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:10 compute-0 sudo[69737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqllgrmoxafowsrcbgsqijbjpkdptdke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326009.2263734-448-123921601491493/AnsiballZ_timezone.py'
Oct 01 13:40:10 compute-0 sudo[69737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:10 compute-0 python3.9[69739]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct 01 13:40:10 compute-0 systemd[1]: Starting Time & Date Service...
Oct 01 13:40:10 compute-0 systemd[1]: Started Time & Date Service.
Oct 01 13:40:10 compute-0 sudo[69737]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:11 compute-0 sudo[69893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxoxijdltbatvkzgckdhdkmwlnivgvim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326010.7834775-466-233963636225981/AnsiballZ_file.py'
Oct 01 13:40:11 compute-0 sudo[69893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:11 compute-0 python3.9[69895]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:11 compute-0 sudo[69893]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:11 compute-0 sudo[70045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrqewbzekmtreudfjawnutxvlaxftfdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326011.6240997-482-25680839986594/AnsiballZ_stat.py'
Oct 01 13:40:11 compute-0 sudo[70045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:12 compute-0 python3.9[70047]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:40:12 compute-0 sudo[70045]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:12 compute-0 sudo[70168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kouxjdnkqnorbudbcbzzmvcjtwxkdclg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326011.6240997-482-25680839986594/AnsiballZ_copy.py'
Oct 01 13:40:12 compute-0 sudo[70168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:12 compute-0 python3.9[70170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326011.6240997-482-25680839986594/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:12 compute-0 sudo[70168]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:13 compute-0 sudo[70320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whpxccnfhpekbhvxrszpatzakirzhnot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326012.9939885-512-277972547640219/AnsiballZ_stat.py'
Oct 01 13:40:13 compute-0 sudo[70320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:13 compute-0 python3.9[70322]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:40:13 compute-0 sudo[70320]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:13 compute-0 sudo[70443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxglodcwsnsfysdyomzmgsukowtxmlli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326012.9939885-512-277972547640219/AnsiballZ_copy.py'
Oct 01 13:40:13 compute-0 sudo[70443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:14 compute-0 python3.9[70445]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326012.9939885-512-277972547640219/.source.yaml _original_basename=.wdgjlwde follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:14 compute-0 sudo[70443]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:14 compute-0 sudo[70595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxzhcajqigwgnfjsbaejaiyhkulexyxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326014.3192828-542-201782469150692/AnsiballZ_stat.py'
Oct 01 13:40:14 compute-0 sudo[70595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:15 compute-0 python3.9[70597]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:40:15 compute-0 sudo[70595]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:15 compute-0 sudo[70718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpzrrzblnkuzwqjbbzdathctkhavsbfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326014.3192828-542-201782469150692/AnsiballZ_copy.py'
Oct 01 13:40:15 compute-0 sudo[70718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:15 compute-0 python3.9[70720]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326014.3192828-542-201782469150692/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:15 compute-0 sudo[70718]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:16 compute-0 sudo[70870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psmflchqignxowqcajqmucasshabbkey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326015.886014-572-47734682678195/AnsiballZ_command.py'
Oct 01 13:40:16 compute-0 sudo[70870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:16 compute-0 python3.9[70872]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:40:16 compute-0 sudo[70870]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:17 compute-0 sudo[71023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyubvcgqpdorrulawbomilpghjmxegfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326016.8747423-588-268268318311480/AnsiballZ_command.py'
Oct 01 13:40:17 compute-0 sudo[71023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:17 compute-0 python3.9[71025]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:40:17 compute-0 sudo[71023]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:18 compute-0 sudo[71176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlpnxrocfepftrregoeytldonjblutwe ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759326017.5812533-604-46857717256516/AnsiballZ_edpm_nftables_from_files.py'
Oct 01 13:40:18 compute-0 sudo[71176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:18 compute-0 python3[71178]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 01 13:40:18 compute-0 sudo[71176]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:18 compute-0 sudo[71328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwmixascmacfgrajgxeroxicpriqnnhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326018.5079448-620-54988718414483/AnsiballZ_stat.py'
Oct 01 13:40:18 compute-0 sudo[71328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:19 compute-0 python3.9[71330]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:40:19 compute-0 sudo[71328]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:19 compute-0 sudo[71451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aywclxplvufjjonjarvfltktfexnsowl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326018.5079448-620-54988718414483/AnsiballZ_copy.py'
Oct 01 13:40:19 compute-0 sudo[71451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:19 compute-0 python3.9[71453]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326018.5079448-620-54988718414483/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:19 compute-0 sudo[71451]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:20 compute-0 sudo[71603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riqjdyqpbtcguapbwmtcznvxvmiegzpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326020.0954614-650-212697713050428/AnsiballZ_stat.py'
Oct 01 13:40:20 compute-0 sudo[71603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:20 compute-0 python3.9[71605]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:40:20 compute-0 sudo[71603]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:21 compute-0 sudo[71726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsqsyyaaodgvtsbjcrlcicqeoewugtir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326020.0954614-650-212697713050428/AnsiballZ_copy.py'
Oct 01 13:40:21 compute-0 sudo[71726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:21 compute-0 python3.9[71728]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326020.0954614-650-212697713050428/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:21 compute-0 sudo[71726]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:21 compute-0 sudo[71878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcyjgxdfembqbcxuklpfowpxgzyobzyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326021.5908267-680-73434577954121/AnsiballZ_stat.py'
Oct 01 13:40:21 compute-0 sudo[71878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:22 compute-0 python3.9[71880]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:40:22 compute-0 sudo[71878]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:22 compute-0 sudo[72001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqyzzebuxzlrureliaayxhuocvkatzey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326021.5908267-680-73434577954121/AnsiballZ_copy.py'
Oct 01 13:40:22 compute-0 sudo[72001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:22 compute-0 python3.9[72003]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326021.5908267-680-73434577954121/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:22 compute-0 sudo[72001]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:23 compute-0 sudo[72153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roepdmibgzjtnnczsouxnsjnuqpkqeid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326022.9481773-710-120731703402904/AnsiballZ_stat.py'
Oct 01 13:40:23 compute-0 sudo[72153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:23 compute-0 python3.9[72155]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:40:23 compute-0 sudo[72153]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:24 compute-0 sudo[72276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hanfqdysodwxlwzkujdpeznvviirswwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326022.9481773-710-120731703402904/AnsiballZ_copy.py'
Oct 01 13:40:24 compute-0 sudo[72276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:24 compute-0 python3.9[72278]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326022.9481773-710-120731703402904/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:24 compute-0 sudo[72276]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:24 compute-0 sudo[72428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtghgcmyepgvgtalzngqhanrbmgaldeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326024.5471096-740-169151207012222/AnsiballZ_stat.py'
Oct 01 13:40:24 compute-0 sudo[72428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:25 compute-0 python3.9[72430]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:40:25 compute-0 sudo[72428]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:25 compute-0 sudo[72551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pchngjqwtycxwmnaypfruvxuzcetxpwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326024.5471096-740-169151207012222/AnsiballZ_copy.py'
Oct 01 13:40:25 compute-0 sudo[72551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:25 compute-0 python3.9[72553]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326024.5471096-740-169151207012222/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:25 compute-0 sudo[72551]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:26 compute-0 chronyd[61033]: Selected source 54.39.196.172 (pool.ntp.org)
Oct 01 13:40:26 compute-0 sudo[72703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbgyxeeltkplvvsmvficgvzqmjiqylry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326025.9998116-770-183263967425510/AnsiballZ_file.py'
Oct 01 13:40:26 compute-0 sudo[72703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:26 compute-0 python3.9[72705]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:26 compute-0 sudo[72703]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:27 compute-0 sudo[72855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uarxfrcfqxgwchshziaxeceeeqwdttfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326026.7799072-786-188541257940098/AnsiballZ_command.py'
Oct 01 13:40:27 compute-0 sudo[72855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:27 compute-0 python3.9[72857]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:40:27 compute-0 sudo[72855]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:28 compute-0 sudo[73014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlikpgauqwmzjyqfxmkytytfsjyrhrim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326027.7048087-802-203458094525378/AnsiballZ_blockinfile.py'
Oct 01 13:40:28 compute-0 sudo[73014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:28 compute-0 python3.9[73016]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:28 compute-0 sudo[73014]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:29 compute-0 sudo[73167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxpicbiltpjwhgswyoamyvwwoiggzcrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326028.766994-820-239997083184664/AnsiballZ_file.py'
Oct 01 13:40:29 compute-0 sudo[73167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:29 compute-0 python3.9[73169]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:29 compute-0 sudo[73167]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:29 compute-0 sudo[73319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhgrfrxdbcpddxwqenmfvbjagealxzmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326029.6128461-820-123412753532128/AnsiballZ_file.py'
Oct 01 13:40:29 compute-0 sudo[73319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:30 compute-0 python3.9[73321]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:30 compute-0 sudo[73319]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:30 compute-0 sudo[73471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhnxlevralyngdbwgcfrznfacdcuhomo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326030.4625607-850-103886332109921/AnsiballZ_mount.py'
Oct 01 13:40:30 compute-0 sudo[73471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:31 compute-0 python3.9[73473]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 01 13:40:31 compute-0 sudo[73471]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:31 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 01 13:40:31 compute-0 sudo[73625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrkpxaxolpcofrclzedmhnvoectkzwop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326031.4079428-850-81435071105559/AnsiballZ_mount.py'
Oct 01 13:40:31 compute-0 sudo[73625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:31 compute-0 python3.9[73627]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct 01 13:40:31 compute-0 sudo[73625]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:32 compute-0 sshd-session[65793]: Connection closed by 192.168.122.30 port 55558
Oct 01 13:40:32 compute-0 sshd-session[65790]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:40:32 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Oct 01 13:40:32 compute-0 systemd[1]: session-16.scope: Consumed 35.533s CPU time.
Oct 01 13:40:32 compute-0 systemd-logind[791]: Session 16 logged out. Waiting for processes to exit.
Oct 01 13:40:32 compute-0 systemd-logind[791]: Removed session 16.
Oct 01 13:40:37 compute-0 sshd-session[73653]: Accepted publickey for zuul from 192.168.122.30 port 40262 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:40:37 compute-0 systemd-logind[791]: New session 17 of user zuul.
Oct 01 13:40:37 compute-0 systemd[1]: Started Session 17 of User zuul.
Oct 01 13:40:37 compute-0 sshd-session[73653]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:40:38 compute-0 sudo[73806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjyvdrokplgdbdxjfsbeikuvblclsgcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326038.080877-17-206071985282021/AnsiballZ_tempfile.py'
Oct 01 13:40:38 compute-0 sudo[73806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:38 compute-0 python3.9[73808]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct 01 13:40:38 compute-0 sudo[73806]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:39 compute-0 sudo[73958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjxhtnuwfwdksjoucxtosuyiyndzdcme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326039.078269-41-180661583679291/AnsiballZ_stat.py'
Oct 01 13:40:39 compute-0 sudo[73958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:39 compute-0 python3.9[73960]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:40:39 compute-0 sudo[73958]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:40 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 01 13:40:40 compute-0 sudo[74112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vykbfvnamndpbhtqwxyclrpydrshqdyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326040.130396-61-149646458191431/AnsiballZ_setup.py'
Oct 01 13:40:40 compute-0 sudo[74112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:41 compute-0 python3.9[74114]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:40:41 compute-0 sudo[74112]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:41 compute-0 sudo[74264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfxbyrkjbqumnqzplacogdpujfxjjcto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326041.4034693-78-75093189938826/AnsiballZ_blockinfile.py'
Oct 01 13:40:41 compute-0 sudo[74264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:42 compute-0 python3.9[74266]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCtqm93fsIECaRaBh5dYj6PpbaUZ9KcRfsqdWMxfqsmJs6g6QpKlgofUm+uvbYJwqkJs3HixpIPozoeYNjjNVueod/3+44G17FK7NcWiFPlmD3FBMkPOmQvL7iEKIJAEiLVg5T6EEWISZYsVlz/08tz1sFbzSHP4GG/lLh8IgM7z33TWJf5a2hHfQwFrB48BjmrHSUtiDzIC/+hatIQew4c1oZK5P2VMsFoEBzrhgIhYuLzOod1ueG9N51tc+OE24ztXsVrheBQRd4V+v8e6pX5ckS2yYcL3cpfnsj2xVnqLOaIG25D6a7s9J0VBlPWv2VE4G+aoYoQ6YkObRnXXzImqSJ0YTGVW34d4iZWwiAD6/nGM8m2yNIhdarpMIB+C1PUzPWuPUUlBmVV2HvKCrnUux/MC5Y62vI9PaXKodopGKqYKJm5hWSWVgtZFS1Z+d5H7zrq62k+Mv8sWQk7UEbPwBvACZ9R3DLF0gvbRWX8Nd01Y7f0i2SmT4DoHMUceyU=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJMMf3snwR5/CN7ctk7xi+oso6Kk5xmDZKGMBPRLFOh4
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM2HLYdNUms4vntNVgl6ayhdFQfOvOlYp56iTbW5P5/ulxrs0Ex7A7jUwxazmeZ9nahLPa00gfYazL/tO+XN09A=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLyBu2zz0RcXC4GquXU9lsiaev8KNI4W2uMFfIlOewNfpnodHyCBmLzcwRkPXemWmkZ3SIz54wdZAnRwgl4FDnJQshy9c+kPsN3R3Bh3cjiAaCV+KurDZv/V3Dz9HJkQHc2+xUQuBd/pRbgufDxyrNqDhoN85GfkKui/zRwQD3fiy+Rhz3eAqTFSleyRRiMrdStBtDPkPg7IoRnPgmyw6RYc33ye4r4oxXeuFgomkkz7++va3wrKnVYx80oKmRL/eP7CAop1CPMr1VERj9GljBZuR1VfIfYY7uf2900C5MAow00W79ii/9oZyqbC1OKPlrIoNX859qdVsemOGgS8o+dPmycG/myPeSLgg9Y0lPb65wnobnZu53Ib8P9LVR7bv0ED27gZHneljzP2j+QV+/VNOUV1uSMvsiyakrlJ+5Rt+clzehQNCkiEIA/+6s4M7BkgpTF6Nu+aL+yZ/MHTQek9gGq3Be8nkUpd4kJZIlZ1oU27Rn/4dV/9hJChVAJhU=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBSj3lk61qDGYtTt5jd/IuF7HYf4mFoSSELJLpuGWjNN
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBiZXjpfPJjty2mPZeYheMD+dL0XzalpWEYGiQAXBKvpVvOCEeQRCsAVteennEWJvKEQz731so7A70yTqza7zQs=
                                             create=True mode=0644 path=/tmp/ansible.3r9mvax3 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:42 compute-0 sudo[74264]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:42 compute-0 sudo[74416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfjpenlnhdukxtzxzwrzljklbwhulpzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326042.4457395-94-91231320971997/AnsiballZ_command.py'
Oct 01 13:40:42 compute-0 sudo[74416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:43 compute-0 python3.9[74418]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.3r9mvax3' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:40:43 compute-0 sudo[74416]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:43 compute-0 sudo[74570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twjcmdbdackhezpasxmkkkeebckuugvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326043.4382937-110-153218704387791/AnsiballZ_file.py'
Oct 01 13:40:44 compute-0 sudo[74570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:44 compute-0 python3.9[74572]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.3r9mvax3 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:44 compute-0 sudo[74570]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:44 compute-0 sshd-session[73656]: Connection closed by 192.168.122.30 port 40262
Oct 01 13:40:44 compute-0 sshd-session[73653]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:40:44 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Oct 01 13:40:44 compute-0 systemd[1]: session-17.scope: Consumed 4.352s CPU time.
Oct 01 13:40:44 compute-0 systemd-logind[791]: Session 17 logged out. Waiting for processes to exit.
Oct 01 13:40:44 compute-0 systemd-logind[791]: Removed session 17.
Oct 01 13:40:49 compute-0 sshd-session[74597]: Accepted publickey for zuul from 192.168.122.30 port 40110 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:40:49 compute-0 systemd-logind[791]: New session 18 of user zuul.
Oct 01 13:40:49 compute-0 systemd[1]: Started Session 18 of User zuul.
Oct 01 13:40:49 compute-0 sshd-session[74597]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:40:51 compute-0 python3.9[74750]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:40:52 compute-0 sudo[74904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqwzxawxitfsfesjyjibzwxwvyqqhniu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326051.5050263-44-44516541315387/AnsiballZ_systemd.py'
Oct 01 13:40:52 compute-0 sudo[74904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:52 compute-0 python3.9[74906]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 01 13:40:52 compute-0 sudo[74904]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:53 compute-0 sudo[75058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tymhbxkloguynoorvxqtbwizkdrqfbmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326052.6649425-60-122379940956121/AnsiballZ_systemd.py'
Oct 01 13:40:53 compute-0 sudo[75058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:53 compute-0 python3.9[75060]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:40:53 compute-0 sudo[75058]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:54 compute-0 sudo[75211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbazivxauwzlihoqpwuwxicqijzrzpny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326053.7938178-78-158007185938792/AnsiballZ_command.py'
Oct 01 13:40:54 compute-0 sudo[75211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:54 compute-0 python3.9[75213]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:40:54 compute-0 sudo[75211]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:55 compute-0 sudo[75364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhbxelsmuiucnxiojktmdzduaiclhoui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326054.8060446-94-116887614590225/AnsiballZ_stat.py'
Oct 01 13:40:55 compute-0 sudo[75364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:55 compute-0 python3.9[75366]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:40:55 compute-0 sudo[75364]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:56 compute-0 sudo[75518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skmwdhithcvvlmhvrsomzupejgddqgaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326055.8287475-110-87257057520524/AnsiballZ_command.py'
Oct 01 13:40:56 compute-0 sudo[75518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:56 compute-0 python3.9[75520]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:40:56 compute-0 sudo[75518]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:57 compute-0 sudo[75673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxrdwmjtjhqnaovezxcpjtphvxbrkelu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326056.6820953-126-49581916377363/AnsiballZ_file.py'
Oct 01 13:40:57 compute-0 sudo[75673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:40:57 compute-0 python3.9[75675]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:40:57 compute-0 sudo[75673]: pam_unix(sudo:session): session closed for user root
Oct 01 13:40:57 compute-0 sshd-session[74600]: Connection closed by 192.168.122.30 port 40110
Oct 01 13:40:57 compute-0 sshd-session[74597]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:40:57 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Oct 01 13:40:57 compute-0 systemd[1]: session-18.scope: Consumed 5.212s CPU time.
Oct 01 13:40:57 compute-0 systemd-logind[791]: Session 18 logged out. Waiting for processes to exit.
Oct 01 13:40:57 compute-0 systemd-logind[791]: Removed session 18.
Oct 01 13:41:03 compute-0 sshd-session[75701]: Accepted publickey for zuul from 192.168.122.30 port 48130 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:41:03 compute-0 systemd-logind[791]: New session 19 of user zuul.
Oct 01 13:41:03 compute-0 systemd[1]: Started Session 19 of User zuul.
Oct 01 13:41:03 compute-0 sshd-session[75701]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:41:04 compute-0 python3.9[75854]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:41:05 compute-0 sudo[76008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kasxdhwppbkifciepupxfdzfykinavsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326064.87674-48-192219157442413/AnsiballZ_setup.py'
Oct 01 13:41:05 compute-0 sudo[76008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:05 compute-0 python3.9[76010]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:41:05 compute-0 sudo[76008]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:06 compute-0 sudo[76092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kilkmqcdoslfcvhbrczmqcofhupfexfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326064.87674-48-192219157442413/AnsiballZ_dnf.py'
Oct 01 13:41:06 compute-0 sudo[76092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:06 compute-0 python3.9[76094]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct 01 13:41:07 compute-0 sudo[76092]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:08 compute-0 python3.9[76245]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:41:09 compute-0 python3.9[76396]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 01 13:41:10 compute-0 python3.9[76546]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:41:11 compute-0 python3.9[76696]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:41:12 compute-0 sshd-session[75704]: Connection closed by 192.168.122.30 port 48130
Oct 01 13:41:12 compute-0 sshd-session[75701]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:41:12 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Oct 01 13:41:12 compute-0 systemd[1]: session-19.scope: Consumed 6.623s CPU time.
Oct 01 13:41:12 compute-0 systemd-logind[791]: Session 19 logged out. Waiting for processes to exit.
Oct 01 13:41:12 compute-0 systemd-logind[791]: Removed session 19.
Oct 01 13:41:17 compute-0 sshd-session[76722]: Accepted publickey for zuul from 192.168.122.30 port 32870 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:41:17 compute-0 systemd-logind[791]: New session 20 of user zuul.
Oct 01 13:41:17 compute-0 systemd[1]: Started Session 20 of User zuul.
Oct 01 13:41:17 compute-0 sshd-session[76722]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:41:18 compute-0 python3.9[76875]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:41:20 compute-0 sudo[77029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqvjbiyurilpagwedvwzqzrchcolkaat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326079.5490882-80-59289110289722/AnsiballZ_file.py'
Oct 01 13:41:20 compute-0 sudo[77029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:20 compute-0 python3.9[77031]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:20 compute-0 sudo[77029]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:20 compute-0 sudo[77181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewaqcvhqwelncgkveikydmljtsnosoin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326080.502502-80-34407006556166/AnsiballZ_file.py'
Oct 01 13:41:20 compute-0 sudo[77181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:20 compute-0 python3.9[77183]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:21 compute-0 sudo[77181]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:21 compute-0 sudo[77333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzmuuqmnucwwvhjfviriamlpggadsfsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326081.2125177-109-193085666771229/AnsiballZ_stat.py'
Oct 01 13:41:21 compute-0 sudo[77333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:21 compute-0 python3.9[77335]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:21 compute-0 sudo[77333]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:22 compute-0 sudo[77456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixdabuefbrnrluwahxthbmoollcosmhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326081.2125177-109-193085666771229/AnsiballZ_copy.py'
Oct 01 13:41:22 compute-0 sudo[77456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:22 compute-0 python3.9[77458]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326081.2125177-109-193085666771229/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=ae477bd8dcc970cdc6aa287250e4c65a411eb79b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:22 compute-0 sudo[77456]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:23 compute-0 sudo[77608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxlsywfitkdshcmljtqxgmfatxwjghnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326082.8763182-109-171915620526930/AnsiballZ_stat.py'
Oct 01 13:41:23 compute-0 sudo[77608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:23 compute-0 python3.9[77610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:23 compute-0 sudo[77608]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:23 compute-0 sudo[77731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shzbogwfwmeivrcrcfecmfmoxjykecec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326082.8763182-109-171915620526930/AnsiballZ_copy.py'
Oct 01 13:41:23 compute-0 sudo[77731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:24 compute-0 python3.9[77733]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326082.8763182-109-171915620526930/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=8afac55eb35e843c3840224bc27527d95ba19e51 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:24 compute-0 sudo[77731]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:24 compute-0 sudo[77883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwrpcfigzvuxataxxrcpwkzntojpdjrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326084.3617392-109-20217129065228/AnsiballZ_stat.py'
Oct 01 13:41:24 compute-0 sudo[77883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:24 compute-0 python3.9[77885]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:24 compute-0 sudo[77883]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:25 compute-0 sudo[78006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvoldjsuuyaqltvclhkzrvcnwtslkkwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326084.3617392-109-20217129065228/AnsiballZ_copy.py'
Oct 01 13:41:25 compute-0 sudo[78006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:25 compute-0 python3.9[78008]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326084.3617392-109-20217129065228/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=ef758ff0bd5c1419035a8454dec3c9e795372d8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:25 compute-0 sudo[78006]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:26 compute-0 sudo[78158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itvhyhxhnwyilnxomtqtbjcbvuxsrdpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326085.7602851-197-142571271848835/AnsiballZ_file.py'
Oct 01 13:41:26 compute-0 sudo[78158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:26 compute-0 python3.9[78160]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:26 compute-0 sudo[78158]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:26 compute-0 sudo[78310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goxwiaxwtulplopfthlldccbzsrbsrxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326086.3700275-197-21014835126928/AnsiballZ_file.py'
Oct 01 13:41:26 compute-0 sudo[78310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:26 compute-0 python3.9[78312]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:26 compute-0 sudo[78310]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:27 compute-0 sudo[78462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqnllykskhqxeadqtgcwpggratwgbrtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326087.1275272-227-92032084023936/AnsiballZ_stat.py'
Oct 01 13:41:27 compute-0 sudo[78462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:27 compute-0 python3.9[78464]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:27 compute-0 sudo[78462]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:28 compute-0 sudo[78585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvzegdwmnsihtgpssbqxoxgziahhtcno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326087.1275272-227-92032084023936/AnsiballZ_copy.py'
Oct 01 13:41:28 compute-0 sudo[78585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:28 compute-0 python3.9[78587]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326087.1275272-227-92032084023936/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=5e4bc98e8271b38618012a057ad2b7af2f13ff3b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:28 compute-0 sudo[78585]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:28 compute-0 sudo[78737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgzhizmnrnjztzcfucibkametkbnhpas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326088.5108216-227-124336576325269/AnsiballZ_stat.py'
Oct 01 13:41:28 compute-0 sudo[78737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:28 compute-0 python3.9[78739]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:28 compute-0 sudo[78737]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:29 compute-0 sudo[78860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frrfejwykqlrrsahcgkkkdxkcuamyela ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326088.5108216-227-124336576325269/AnsiballZ_copy.py'
Oct 01 13:41:29 compute-0 sudo[78860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:29 compute-0 python3.9[78862]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326088.5108216-227-124336576325269/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=2f02fbd438dcd434c9b6ee3b6e1c8fd30387e890 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:29 compute-0 sudo[78860]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:30 compute-0 sudo[79012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyirsozkafqlrwcdnrqkqxowlosylpzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326089.76202-227-140377314404307/AnsiballZ_stat.py'
Oct 01 13:41:30 compute-0 sudo[79012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:30 compute-0 python3.9[79014]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:30 compute-0 sudo[79012]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:30 compute-0 sudo[79135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaprgzntvvvwheumnignbxpykfnyrvpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326089.76202-227-140377314404307/AnsiballZ_copy.py'
Oct 01 13:41:30 compute-0 sudo[79135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:30 compute-0 python3.9[79137]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326089.76202-227-140377314404307/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=cf872893f90fd73aeea4ce24e068bbd86db7b68f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:31 compute-0 sudo[79135]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:31 compute-0 sudo[79287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emspekylorgvgbzztcaqcapuovgctbhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326091.2530062-314-92934288898537/AnsiballZ_file.py'
Oct 01 13:41:31 compute-0 sudo[79287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:31 compute-0 python3.9[79289]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:31 compute-0 sudo[79287]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:32 compute-0 sudo[79439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqqxxdtczbgwtoieseonezunrnmfmbjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326092.0307107-314-259345646821048/AnsiballZ_file.py'
Oct 01 13:41:32 compute-0 sudo[79439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:32 compute-0 python3.9[79441]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:32 compute-0 sudo[79439]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:33 compute-0 sudo[79591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-redmesmelrvfmlsdtlprjpowqijvlmpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326092.8346808-345-246696715539664/AnsiballZ_stat.py'
Oct 01 13:41:33 compute-0 sudo[79591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:33 compute-0 python3.9[79593]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:33 compute-0 sudo[79591]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:33 compute-0 sudo[79714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vozudjxbdhmlvzzhajfuvjieqgbhavaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326092.8346808-345-246696715539664/AnsiballZ_copy.py'
Oct 01 13:41:33 compute-0 sudo[79714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:33 compute-0 python3.9[79716]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326092.8346808-345-246696715539664/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=14114256b2ee842c7b15e5ebf503670759d4478f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:34 compute-0 sudo[79714]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:34 compute-0 sudo[79866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzgpgamnlzkjivcxwtkuldnnvbhemjuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326094.1148217-345-90668636393080/AnsiballZ_stat.py'
Oct 01 13:41:34 compute-0 sudo[79866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:34 compute-0 python3.9[79868]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:34 compute-0 sudo[79866]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:34 compute-0 sudo[79989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skvxssvqdqrdtiidusygoqqqhnlmabkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326094.1148217-345-90668636393080/AnsiballZ_copy.py'
Oct 01 13:41:34 compute-0 sudo[79989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:35 compute-0 python3.9[79991]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326094.1148217-345-90668636393080/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=1803380d3e4bbf8a76f61308531345e6793046d4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:35 compute-0 sudo[79989]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:35 compute-0 sudo[80141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkpmkpvuppkxflyqoifuqmuixttpwpoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326095.334456-345-105572732088908/AnsiballZ_stat.py'
Oct 01 13:41:35 compute-0 sudo[80141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:35 compute-0 python3.9[80143]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:35 compute-0 sudo[80141]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:36 compute-0 sudo[80264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfmkdypzckpobusoznkhedkalpuhmmhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326095.334456-345-105572732088908/AnsiballZ_copy.py'
Oct 01 13:41:36 compute-0 sudo[80264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:36 compute-0 python3.9[80266]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326095.334456-345-105572732088908/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=13c71babe6c2ef029139f6a3c726b291c0782524 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:36 compute-0 sudo[80264]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:37 compute-0 sudo[80416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-babytjesyeefgfsopsvbtsrjgbzgbrqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326096.656898-430-143595436681310/AnsiballZ_file.py'
Oct 01 13:41:37 compute-0 sudo[80416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:37 compute-0 python3.9[80418]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:37 compute-0 sudo[80416]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:37 compute-0 sudo[80568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdpqmukmveiwjdrdxyxcwzqquaijizty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326097.4244335-430-260453402048423/AnsiballZ_file.py'
Oct 01 13:41:37 compute-0 sudo[80568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:37 compute-0 python3.9[80570]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:38 compute-0 sudo[80568]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:38 compute-0 sudo[80720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbnliyndtcqgjbmadvgjbqvcdhujitiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326098.2063458-459-203394307744313/AnsiballZ_stat.py'
Oct 01 13:41:38 compute-0 sudo[80720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:38 compute-0 python3.9[80722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:38 compute-0 sudo[80720]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:39 compute-0 sudo[80843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqktbzwqwjvatfwzbycssywaplqtoaut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326098.2063458-459-203394307744313/AnsiballZ_copy.py'
Oct 01 13:41:39 compute-0 sudo[80843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:39 compute-0 python3.9[80845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326098.2063458-459-203394307744313/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=7601fbaa083fd5e21cd5eb014d46fd53b7857703 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:39 compute-0 sudo[80843]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:39 compute-0 sudo[80995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwkmllcppvuhwzpnbkofmgrfspdojexn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326099.6229246-459-92738313515185/AnsiballZ_stat.py'
Oct 01 13:41:39 compute-0 sudo[80995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:40 compute-0 python3.9[80997]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:40 compute-0 sudo[80995]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:40 compute-0 sudo[81118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrlnlplihcfnopgwaqsqaorufkbpulyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326099.6229246-459-92738313515185/AnsiballZ_copy.py'
Oct 01 13:41:40 compute-0 sudo[81118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:40 compute-0 python3.9[81120]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326099.6229246-459-92738313515185/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=1803380d3e4bbf8a76f61308531345e6793046d4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:40 compute-0 sudo[81118]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:41 compute-0 sudo[81270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoezxrncjrzrspjrkmqmoqtdomayffex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326101.0341-459-47461275423369/AnsiballZ_stat.py'
Oct 01 13:41:41 compute-0 sudo[81270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:41 compute-0 python3.9[81272]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:41 compute-0 sudo[81270]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:42 compute-0 sudo[81393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgttuuxoqfrieqxitibkfatghsjkhynx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326101.0341-459-47461275423369/AnsiballZ_copy.py'
Oct 01 13:41:42 compute-0 sudo[81393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:42 compute-0 python3.9[81395]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326101.0341-459-47461275423369/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=3193d7040d810026eb9a36ce3e28aa854f7cc45f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:42 compute-0 sudo[81393]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:43 compute-0 sudo[81545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyvjuefqtvnqqsvuepuurfgbdztougfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326102.9301987-567-81032361436026/AnsiballZ_file.py'
Oct 01 13:41:43 compute-0 sudo[81545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:43 compute-0 python3.9[81547]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:43 compute-0 sudo[81545]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:44 compute-0 sudo[81697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehdqmviermgniehjypvyuhpgzhxrctwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326103.6520436-594-192544538503432/AnsiballZ_stat.py'
Oct 01 13:41:44 compute-0 sudo[81697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:44 compute-0 python3.9[81699]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:44 compute-0 sudo[81697]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:44 compute-0 sudo[81820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmyrgjlnzckxpsyjwwptbeuqimxjyosr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326103.6520436-594-192544538503432/AnsiballZ_copy.py'
Oct 01 13:41:44 compute-0 sudo[81820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:44 compute-0 python3.9[81822]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326103.6520436-594-192544538503432/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=768c0bf9fa9273d82e48b91de3840276afe8c79e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:44 compute-0 sudo[81820]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:45 compute-0 sudo[81972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swgyjkdavqjaejnhwlwbkbktaqhthrmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326105.1892111-628-98757713969129/AnsiballZ_file.py'
Oct 01 13:41:45 compute-0 sudo[81972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:45 compute-0 python3.9[81974]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:45 compute-0 sudo[81972]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:46 compute-0 sudo[82124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylaxyfxqpmzoqlpunkisuwcotngrnbll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326105.9547863-643-84553163587356/AnsiballZ_stat.py'
Oct 01 13:41:46 compute-0 sudo[82124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:46 compute-0 python3.9[82126]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:46 compute-0 sudo[82124]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:46 compute-0 sudo[82247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkhfargiqrfseltigfrxecbfnivbxdcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326105.9547863-643-84553163587356/AnsiballZ_copy.py'
Oct 01 13:41:46 compute-0 sudo[82247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:47 compute-0 python3.9[82249]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326105.9547863-643-84553163587356/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=768c0bf9fa9273d82e48b91de3840276afe8c79e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:47 compute-0 sudo[82247]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:47 compute-0 sudo[82399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlsweciopqrxjazoqudrmxrsdaydurev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326107.275692-673-135835140587507/AnsiballZ_file.py'
Oct 01 13:41:47 compute-0 sudo[82399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:47 compute-0 python3.9[82401]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:47 compute-0 sudo[82399]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:48 compute-0 sudo[82551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkbqnykukqcradzoojxvzlpwnewtjvof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326107.996187-687-45948949037922/AnsiballZ_stat.py'
Oct 01 13:41:48 compute-0 sudo[82551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:48 compute-0 python3.9[82553]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:48 compute-0 sudo[82551]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:49 compute-0 sudo[82674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqnbiswadkdjlxbqvgtrmmwmffqppzed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326107.996187-687-45948949037922/AnsiballZ_copy.py'
Oct 01 13:41:49 compute-0 sudo[82674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:49 compute-0 python3.9[82676]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326107.996187-687-45948949037922/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=768c0bf9fa9273d82e48b91de3840276afe8c79e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:49 compute-0 sudo[82674]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:49 compute-0 sudo[82826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opowcyokcutcddbidxjqxuuhdypwxgwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326109.4662766-719-241617452540511/AnsiballZ_file.py'
Oct 01 13:41:49 compute-0 sudo[82826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:50 compute-0 python3.9[82828]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:50 compute-0 sudo[82826]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:50 compute-0 sudo[82978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ionqqwbrjnkkohdnkvplvtwnpqqynivs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326110.254785-738-52508109920775/AnsiballZ_stat.py'
Oct 01 13:41:50 compute-0 sudo[82978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:50 compute-0 python3.9[82980]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:50 compute-0 sudo[82978]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:51 compute-0 sudo[83101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yizkxfhiynqicaqrxphznvzxnzxplzoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326110.254785-738-52508109920775/AnsiballZ_copy.py'
Oct 01 13:41:51 compute-0 sudo[83101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:51 compute-0 python3.9[83103]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326110.254785-738-52508109920775/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=768c0bf9fa9273d82e48b91de3840276afe8c79e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:51 compute-0 sudo[83101]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:52 compute-0 sudo[83253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnzexrzbtrwohxmdlaigeusvwkvglfgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326111.781547-767-3999891801331/AnsiballZ_file.py'
Oct 01 13:41:52 compute-0 sudo[83253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:52 compute-0 python3.9[83255]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:52 compute-0 sudo[83253]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:53 compute-0 sudo[83405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nogibonrbyfbwbecrksqwtjjnxlmexns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326112.647482-783-266675506675891/AnsiballZ_stat.py'
Oct 01 13:41:53 compute-0 sudo[83405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:53 compute-0 python3.9[83407]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:53 compute-0 sudo[83405]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:53 compute-0 sudo[83528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjhgvpwontoggfmtohrnlkfsjpkzykqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326112.647482-783-266675506675891/AnsiballZ_copy.py'
Oct 01 13:41:53 compute-0 sudo[83528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:53 compute-0 python3.9[83530]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326112.647482-783-266675506675891/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=768c0bf9fa9273d82e48b91de3840276afe8c79e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:53 compute-0 sudo[83528]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:54 compute-0 sudo[83680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvtxbpfvflirfekuwavenjyyssfdznrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326114.1531703-818-87714091927382/AnsiballZ_file.py'
Oct 01 13:41:54 compute-0 sudo[83680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:54 compute-0 python3.9[83682]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:54 compute-0 sudo[83680]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:55 compute-0 sudo[83832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olujvifanjhdnapmecokgjwvtczspfqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326114.8327286-835-36545141813101/AnsiballZ_stat.py'
Oct 01 13:41:55 compute-0 sudo[83832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:55 compute-0 python3.9[83834]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:55 compute-0 sudo[83832]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:55 compute-0 sudo[83955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjbcjvfviytnhmbcpqyibeyikpsvortc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326114.8327286-835-36545141813101/AnsiballZ_copy.py'
Oct 01 13:41:55 compute-0 sudo[83955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:56 compute-0 python3.9[83957]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326114.8327286-835-36545141813101/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=768c0bf9fa9273d82e48b91de3840276afe8c79e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:56 compute-0 sudo[83955]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:56 compute-0 sudo[84107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbtjbozpoqbfwkaeusmgjajlnkblqvpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326116.3276572-865-210370200517858/AnsiballZ_file.py'
Oct 01 13:41:56 compute-0 sudo[84107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:56 compute-0 python3.9[84109]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:41:56 compute-0 sudo[84107]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:57 compute-0 sudo[84259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdcewovggjuwzpanfadbnkdozqtzcgiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326117.0965137-883-271203969872488/AnsiballZ_stat.py'
Oct 01 13:41:57 compute-0 sudo[84259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:57 compute-0 python3.9[84261]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:41:57 compute-0 sudo[84259]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:57 compute-0 sudo[84382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oumvbtwfvuavqalvfeqpxoxxecdcnefb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326117.0965137-883-271203969872488/AnsiballZ_copy.py'
Oct 01 13:41:57 compute-0 sudo[84382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:41:58 compute-0 python3.9[84384]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326117.0965137-883-271203969872488/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=768c0bf9fa9273d82e48b91de3840276afe8c79e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:41:58 compute-0 sudo[84382]: pam_unix(sudo:session): session closed for user root
Oct 01 13:41:59 compute-0 sshd-session[76725]: Connection closed by 192.168.122.30 port 32870
Oct 01 13:41:59 compute-0 sshd-session[76722]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:41:59 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Oct 01 13:41:59 compute-0 systemd[1]: session-20.scope: Consumed 31.898s CPU time.
Oct 01 13:41:59 compute-0 systemd-logind[791]: Session 20 logged out. Waiting for processes to exit.
Oct 01 13:41:59 compute-0 systemd-logind[791]: Removed session 20.
Oct 01 13:42:00 compute-0 PackageKit[31418]: daemon quit
Oct 01 13:42:00 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct 01 13:42:05 compute-0 sshd-session[84411]: Accepted publickey for zuul from 192.168.122.30 port 42584 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:42:05 compute-0 systemd-logind[791]: New session 21 of user zuul.
Oct 01 13:42:05 compute-0 systemd[1]: Started Session 21 of User zuul.
Oct 01 13:42:05 compute-0 sshd-session[84411]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:42:06 compute-0 python3.9[84564]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:42:07 compute-0 sudo[84718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyzisligtxxktiucutvbzrgpihuapkac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326126.8886251-48-130518478772391/AnsiballZ_file.py'
Oct 01 13:42:07 compute-0 sudo[84718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:07 compute-0 python3.9[84720]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:42:07 compute-0 sudo[84718]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:08 compute-0 sudo[84870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcjtnsczkghqxuzofhkwwjfzsktawbfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326127.843581-48-27130729730395/AnsiballZ_file.py'
Oct 01 13:42:08 compute-0 sudo[84870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:08 compute-0 python3.9[84872]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:42:08 compute-0 sudo[84870]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:09 compute-0 python3.9[85022]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:42:10 compute-0 sudo[85172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aygzeqtktfhkztwvcntdlbrkjhaondlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326129.6767883-94-87512188276588/AnsiballZ_seboolean.py'
Oct 01 13:42:10 compute-0 sudo[85172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:10 compute-0 python3.9[85174]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 01 13:42:11 compute-0 sudo[85172]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:12 compute-0 sudo[85328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfphgwuzkxlarpwtuswyexghlvowtsgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326131.8776767-114-13793911334265/AnsiballZ_setup.py'
Oct 01 13:42:12 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Oct 01 13:42:12 compute-0 sudo[85328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:12 compute-0 python3.9[85330]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:42:12 compute-0 sudo[85328]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:13 compute-0 sudo[85412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uowbsdqewyecispcdwnqqerjxrdioqqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326131.8776767-114-13793911334265/AnsiballZ_dnf.py'
Oct 01 13:42:13 compute-0 sudo[85412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:13 compute-0 python3.9[85414]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:42:14 compute-0 sudo[85412]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:15 compute-0 sudo[85565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oumqyygzehwjleadbluphcfeqayswcsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326135.0056198-138-143387866968884/AnsiballZ_systemd.py'
Oct 01 13:42:15 compute-0 sudo[85565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:16 compute-0 python3.9[85567]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 01 13:42:16 compute-0 sudo[85565]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:17 compute-0 sudo[85720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmqxngejnbciypdcdngyveuzjtynjmwz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759326136.4542522-154-220489870737079/AnsiballZ_edpm_nftables_snippet.py'
Oct 01 13:42:17 compute-0 sudo[85720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:17 compute-0 python3[85722]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct 01 13:42:17 compute-0 sudo[85720]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:17 compute-0 sudo[85872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhyyugymltewchkwqodnznzajaezwcih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326137.5917435-172-139601005366034/AnsiballZ_file.py'
Oct 01 13:42:17 compute-0 sudo[85872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:18 compute-0 python3.9[85874]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:18 compute-0 sudo[85872]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:18 compute-0 sudo[86024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbvyrhlbsdhgtxbxuubtavmnqnkwmovq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326138.3886938-188-73182821784361/AnsiballZ_stat.py'
Oct 01 13:42:18 compute-0 sudo[86024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:19 compute-0 python3.9[86026]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:19 compute-0 sudo[86024]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:19 compute-0 sudo[86102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-criaawmnfsekhwwxdkmmsmqaynastydu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326138.3886938-188-73182821784361/AnsiballZ_file.py'
Oct 01 13:42:19 compute-0 sudo[86102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:19 compute-0 python3.9[86104]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:19 compute-0 sudo[86102]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:20 compute-0 sudo[86254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bohjuxtajddxgbpkyilvhpscdxcvcobd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326139.8913114-212-195764870191411/AnsiballZ_stat.py'
Oct 01 13:42:20 compute-0 sudo[86254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:20 compute-0 python3.9[86256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:20 compute-0 sudo[86254]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:20 compute-0 sudo[86332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-remvledpbggxcfkwcoyksfybpkuetzsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326139.8913114-212-195764870191411/AnsiballZ_file.py'
Oct 01 13:42:20 compute-0 sudo[86332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:21 compute-0 python3.9[86334]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.xu9fk77a recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:21 compute-0 sudo[86332]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:21 compute-0 sudo[86484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smwrxqgkuqiozkyoeummstkiihbqseri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326141.2119753-236-203957749505264/AnsiballZ_stat.py'
Oct 01 13:42:21 compute-0 sudo[86484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:21 compute-0 python3.9[86486]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:21 compute-0 sudo[86484]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:22 compute-0 sudo[86562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udziobujwfabzbzvdbgrlmnzcfztqryp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326141.2119753-236-203957749505264/AnsiballZ_file.py'
Oct 01 13:42:22 compute-0 sudo[86562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:22 compute-0 python3.9[86564]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:22 compute-0 sudo[86562]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:23 compute-0 sudo[86714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luusoxprovcgvjzmdbtxwxnytijssciu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326142.5518875-262-268576190222827/AnsiballZ_command.py'
Oct 01 13:42:23 compute-0 sudo[86714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:23 compute-0 python3.9[86716]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:42:23 compute-0 sudo[86714]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:24 compute-0 sudo[86867]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwxecgitexgxnpbyhqokxijeevkuqyhe ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759326143.5242906-278-275968144172758/AnsiballZ_edpm_nftables_from_files.py'
Oct 01 13:42:24 compute-0 sudo[86867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:24 compute-0 python3[86869]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 01 13:42:24 compute-0 sudo[86867]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:24 compute-0 sudo[87019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luhbufbuyzbtudejtfrsrmhvzdnytjqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326144.518534-294-7883822712464/AnsiballZ_stat.py'
Oct 01 13:42:24 compute-0 sudo[87019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:25 compute-0 python3.9[87021]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:25 compute-0 sudo[87019]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:25 compute-0 sudo[87144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxjrrrxddlhurmezniuymrorfliymmip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326144.518534-294-7883822712464/AnsiballZ_copy.py'
Oct 01 13:42:25 compute-0 sudo[87144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:25 compute-0 python3.9[87146]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326144.518534-294-7883822712464/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:25 compute-0 sudo[87144]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:26 compute-0 sudo[87296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpjlwktqzzkzinokinnsxswpkotapjnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326146.0925102-324-10633815558624/AnsiballZ_stat.py'
Oct 01 13:42:26 compute-0 sudo[87296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:26 compute-0 python3.9[87298]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:26 compute-0 sudo[87296]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:27 compute-0 sudo[87421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reclegdbvmmfqglqqozcdvpsvqmbtpsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326146.0925102-324-10633815558624/AnsiballZ_copy.py'
Oct 01 13:42:27 compute-0 sudo[87421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:27 compute-0 python3.9[87423]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326146.0925102-324-10633815558624/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:27 compute-0 sudo[87421]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:27 compute-0 sudo[87573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbajjnvyncazzutfzacadwmjjcmfaoac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326147.5568736-354-7832050776150/AnsiballZ_stat.py'
Oct 01 13:42:27 compute-0 sudo[87573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:28 compute-0 python3.9[87575]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:28 compute-0 sudo[87573]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:28 compute-0 sudo[87698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoyuhyoeyqyqprfiuffuwtjqalifvsiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326147.5568736-354-7832050776150/AnsiballZ_copy.py'
Oct 01 13:42:28 compute-0 sudo[87698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:28 compute-0 python3.9[87700]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326147.5568736-354-7832050776150/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:28 compute-0 sudo[87698]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:29 compute-0 sudo[87850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyqzltyrhzbwqoxhjbxqyijkvblnsdjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326149.1039639-384-190758810647626/AnsiballZ_stat.py'
Oct 01 13:42:29 compute-0 sudo[87850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:29 compute-0 python3.9[87852]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:29 compute-0 sudo[87850]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:30 compute-0 sudo[87975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvnjpbigcaaoiuvgvhseuziklwwqvppu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326149.1039639-384-190758810647626/AnsiballZ_copy.py'
Oct 01 13:42:30 compute-0 sudo[87975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:30 compute-0 python3.9[87977]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326149.1039639-384-190758810647626/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:30 compute-0 sudo[87975]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:31 compute-0 sudo[88127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pattjsbsklhwzkyhvityhewheqbqqcjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326150.740155-414-20520425024821/AnsiballZ_stat.py'
Oct 01 13:42:31 compute-0 sudo[88127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:31 compute-0 python3.9[88129]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:31 compute-0 sudo[88127]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:31 compute-0 sudo[88252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lefyswdkdgfjslevdfbfrpcxjskcndlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326150.740155-414-20520425024821/AnsiballZ_copy.py'
Oct 01 13:42:31 compute-0 sudo[88252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:32 compute-0 python3.9[88254]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326150.740155-414-20520425024821/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:32 compute-0 sudo[88252]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:32 compute-0 sudo[88404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qofykgapcocpgknvkgqgzayycafwfvou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326152.3987446-444-22190266183131/AnsiballZ_file.py'
Oct 01 13:42:32 compute-0 sudo[88404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:33 compute-0 python3.9[88406]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:33 compute-0 sudo[88404]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:33 compute-0 sudo[88556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kihdlcsmczezhpeobledtfwhhzqsrdyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326153.2020235-460-18364910810824/AnsiballZ_command.py'
Oct 01 13:42:33 compute-0 sudo[88556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:33 compute-0 python3.9[88558]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:42:33 compute-0 sudo[88556]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:34 compute-0 sudo[88711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqkovspilvyzvggbosowymhzvqrvmnxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326153.950658-476-194885373242185/AnsiballZ_blockinfile.py'
Oct 01 13:42:34 compute-0 sudo[88711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:34 compute-0 python3.9[88713]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:34 compute-0 sudo[88711]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:35 compute-0 sudo[88863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ninfagmtmklhqlvrcpiudpxbnmehirvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326154.9090254-494-262577720059490/AnsiballZ_command.py'
Oct 01 13:42:35 compute-0 sudo[88863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:35 compute-0 python3.9[88865]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:42:35 compute-0 sudo[88863]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:36 compute-0 sudo[89016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkkddjqhzfhejrhwwlbkpebziqzyznla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326155.733921-510-12698690083697/AnsiballZ_stat.py'
Oct 01 13:42:36 compute-0 sudo[89016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:36 compute-0 python3.9[89018]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:42:36 compute-0 sudo[89016]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:36 compute-0 sudo[89170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wudvtagmhccwryaftgvswvgmupggigen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326156.5535104-526-238490985517633/AnsiballZ_command.py'
Oct 01 13:42:36 compute-0 sudo[89170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:37 compute-0 python3.9[89172]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:42:37 compute-0 sudo[89170]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:37 compute-0 sudo[89325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkpifkkemqrdeovrdewqpasnjxsuzkok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326157.3449247-542-140888003858668/AnsiballZ_file.py'
Oct 01 13:42:37 compute-0 sudo[89325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:37 compute-0 python3.9[89327]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:37 compute-0 sudo[89325]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:39 compute-0 python3.9[89477]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:42:40 compute-0 sudo[89628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eexfonoolwimqrvupuebpgkegicmyveb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326159.995182-622-274281520478282/AnsiballZ_command.py'
Oct 01 13:42:40 compute-0 sudo[89628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:40 compute-0 python3.9[89630]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:74:f6:ca:ec" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:42:40 compute-0 ovs-vsctl[89631]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:74:f6:ca:ec external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct 01 13:42:40 compute-0 sudo[89628]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:41 compute-0 sudo[89781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npfcmfjqleshviyktnaxqkuhsupzkses ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326160.8775644-640-96294052217939/AnsiballZ_command.py'
Oct 01 13:42:41 compute-0 sudo[89781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:41 compute-0 python3.9[89783]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:42:41 compute-0 sudo[89781]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:41 compute-0 sudo[89936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovvblqbubfxalnjrquovfovxmqmonjcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326161.631895-656-2809951548609/AnsiballZ_command.py'
Oct 01 13:42:41 compute-0 sudo[89936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:42 compute-0 python3.9[89938]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:42:42 compute-0 ovs-vsctl[89939]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct 01 13:42:42 compute-0 sudo[89936]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:42 compute-0 python3.9[90089]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:42:43 compute-0 sudo[90241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjxuvbpdzoseppuntabtuslygexfqdip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326163.236269-690-142459905404982/AnsiballZ_file.py'
Oct 01 13:42:43 compute-0 sudo[90241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:43 compute-0 python3.9[90243]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:42:43 compute-0 sudo[90241]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:44 compute-0 sudo[90393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uowzjxxpoodcpadibpxawmtiyfqszxpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326164.0249467-706-77545281257179/AnsiballZ_stat.py'
Oct 01 13:42:44 compute-0 sudo[90393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:44 compute-0 python3.9[90395]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:44 compute-0 sudo[90393]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:44 compute-0 sudo[90471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whlwfyinzdrgrkkoeghofmctsixsvatx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326164.0249467-706-77545281257179/AnsiballZ_file.py'
Oct 01 13:42:44 compute-0 sudo[90471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:44 compute-0 python3.9[90473]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:42:45 compute-0 sudo[90471]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:45 compute-0 sudo[90623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzphslsxbnszmmljhhhsjhkagukgfwaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326165.1764317-706-30118618982213/AnsiballZ_stat.py'
Oct 01 13:42:45 compute-0 sudo[90623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:45 compute-0 python3.9[90625]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:45 compute-0 sudo[90623]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:45 compute-0 sudo[90701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzmtgzueaxazqcflepyhmppuqooensix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326165.1764317-706-30118618982213/AnsiballZ_file.py'
Oct 01 13:42:45 compute-0 sudo[90701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:46 compute-0 python3.9[90703]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:42:46 compute-0 sudo[90701]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:46 compute-0 sudo[90853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aethkuwmjahpcpniuqpdqmvhhncmaxpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326166.493015-752-124876632670149/AnsiballZ_file.py'
Oct 01 13:42:46 compute-0 sudo[90853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:46 compute-0 python3.9[90855]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:46 compute-0 sudo[90853]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:47 compute-0 sudo[91005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddrpxuucdlyhduxnqqeaoyjutflevenl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326167.1755059-768-64890358382212/AnsiballZ_stat.py'
Oct 01 13:42:47 compute-0 sudo[91005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:47 compute-0 python3.9[91007]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:47 compute-0 sudo[91005]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:47 compute-0 sudo[91083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kunakixscvywpolpncdbkqdlarzsgaam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326167.1755059-768-64890358382212/AnsiballZ_file.py'
Oct 01 13:42:47 compute-0 sudo[91083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:48 compute-0 python3.9[91085]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:48 compute-0 sudo[91083]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:48 compute-0 sudo[91235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjscplnqdejueiuaubvodypcjeevydtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326168.3651097-792-200517236127836/AnsiballZ_stat.py'
Oct 01 13:42:48 compute-0 sudo[91235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:48 compute-0 python3.9[91237]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:48 compute-0 sudo[91235]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:49 compute-0 sudo[91313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqxxrscujjvgeezzeqhohjdklebtvhxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326168.3651097-792-200517236127836/AnsiballZ_file.py'
Oct 01 13:42:49 compute-0 sudo[91313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:49 compute-0 python3.9[91315]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:49 compute-0 sudo[91313]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:49 compute-0 sudo[91465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftkjwakdqtheuexcrzmtfzaktcywrelw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326169.61194-816-625572012902/AnsiballZ_systemd.py'
Oct 01 13:42:49 compute-0 sudo[91465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:50 compute-0 python3.9[91467]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:42:50 compute-0 systemd[1]: Reloading.
Oct 01 13:42:50 compute-0 systemd-rc-local-generator[91496]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:42:50 compute-0 systemd-sysv-generator[91500]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:42:50 compute-0 sudo[91465]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:50 compute-0 sudo[91654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkjkiuftldidxaqygzscgmvdliymdeqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326170.6810954-832-44717822493640/AnsiballZ_stat.py'
Oct 01 13:42:50 compute-0 sudo[91654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:51 compute-0 python3.9[91656]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:51 compute-0 sudo[91654]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:51 compute-0 sudo[91732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cebicxfnkrbewacyfwocprjamretmcgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326170.6810954-832-44717822493640/AnsiballZ_file.py'
Oct 01 13:42:51 compute-0 sudo[91732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:51 compute-0 python3.9[91734]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:51 compute-0 sudo[91732]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:52 compute-0 sudo[91884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymtzgffezydchorblbhgfopojkrddfsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326172.0015254-856-93021643923812/AnsiballZ_stat.py'
Oct 01 13:42:52 compute-0 sudo[91884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:52 compute-0 python3.9[91886]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:52 compute-0 sudo[91884]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:52 compute-0 sudo[91962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zljnkesutjndsamzkjjvxmvxwkvbmikz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326172.0015254-856-93021643923812/AnsiballZ_file.py'
Oct 01 13:42:52 compute-0 sudo[91962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:52 compute-0 python3.9[91964]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:52 compute-0 sudo[91962]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:53 compute-0 sudo[92114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeogixqtasmmpkuxryeijqgsogsnwxcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326173.2801986-880-70736688039898/AnsiballZ_systemd.py'
Oct 01 13:42:53 compute-0 sudo[92114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:53 compute-0 python3.9[92116]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:42:54 compute-0 systemd[1]: Reloading.
Oct 01 13:42:54 compute-0 systemd-rc-local-generator[92143]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:42:54 compute-0 systemd-sysv-generator[92147]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:42:54 compute-0 systemd[1]: Starting Create netns directory...
Oct 01 13:42:54 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 01 13:42:54 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 01 13:42:54 compute-0 systemd[1]: Finished Create netns directory.
Oct 01 13:42:54 compute-0 sudo[92114]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:54 compute-0 sudo[92308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxbqxfekbdvpznlnsbwrfopkhiaitgpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326174.6462717-900-153352407627495/AnsiballZ_file.py'
Oct 01 13:42:54 compute-0 sudo[92308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:55 compute-0 python3.9[92310]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:42:55 compute-0 sudo[92308]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:55 compute-0 sudo[92460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-necabjpcixocvodzeggbburarcpptmyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326175.4074457-916-154954543009007/AnsiballZ_stat.py'
Oct 01 13:42:55 compute-0 sudo[92460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:55 compute-0 python3.9[92462]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:55 compute-0 sudo[92460]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:56 compute-0 sudo[92583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvvmiejizfwysdexxutcmfrqmarrluiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326175.4074457-916-154954543009007/AnsiballZ_copy.py'
Oct 01 13:42:56 compute-0 sudo[92583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:56 compute-0 python3.9[92585]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326175.4074457-916-154954543009007/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:42:56 compute-0 sudo[92583]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:57 compute-0 sudo[92735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfddgjggilaqvdqcunvcjjyilmlshlck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326177.0058203-950-109912455700684/AnsiballZ_file.py'
Oct 01 13:42:57 compute-0 sudo[92735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:57 compute-0 python3.9[92737]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:42:57 compute-0 sudo[92735]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:58 compute-0 sudo[92887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddevanykorpqxiolrbrriafifdtjxrqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326177.841969-966-231463992125537/AnsiballZ_stat.py'
Oct 01 13:42:58 compute-0 sudo[92887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:58 compute-0 python3.9[92889]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:42:58 compute-0 sudo[92887]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:58 compute-0 sudo[93010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymicgvfsttxadrvnnqvkawtvmfctgqfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326177.841969-966-231463992125537/AnsiballZ_copy.py'
Oct 01 13:42:58 compute-0 sudo[93010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:59 compute-0 python3.9[93012]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326177.841969-966-231463992125537/.source.json _original_basename=.8lt0cu8n follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:59 compute-0 sudo[93010]: pam_unix(sudo:session): session closed for user root
Oct 01 13:42:59 compute-0 sudo[93162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnemyavluyqwzmpklsijplgmxlvhcnuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326179.3058114-996-22691926117589/AnsiballZ_file.py'
Oct 01 13:42:59 compute-0 sudo[93162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:42:59 compute-0 python3.9[93164]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:42:59 compute-0 sudo[93162]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:00 compute-0 sudo[93314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pakrwfwewznzirfuxpqbhqhktfcmbmov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326180.055078-1012-41654906498794/AnsiballZ_stat.py'
Oct 01 13:43:00 compute-0 sudo[93314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:00 compute-0 sudo[93314]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:00 compute-0 sudo[93437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjovgvzonlttokwgfzcszgggqfrppoaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326180.055078-1012-41654906498794/AnsiballZ_copy.py'
Oct 01 13:43:00 compute-0 sudo[93437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:00 compute-0 sudo[93437]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:01 compute-0 sudo[93589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahywdmwmenxxtsgmjqkyokzdnzsrvlll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326181.4195642-1046-34199576036653/AnsiballZ_container_config_data.py'
Oct 01 13:43:01 compute-0 sudo[93589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:02 compute-0 python3.9[93591]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct 01 13:43:02 compute-0 sudo[93589]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:02 compute-0 sudo[93741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dknvrhcuyedlptidzpgggkehwiefglri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326182.2817473-1064-116784017172994/AnsiballZ_container_config_hash.py'
Oct 01 13:43:02 compute-0 sudo[93741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:02 compute-0 python3.9[93743]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 01 13:43:03 compute-0 sudo[93741]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:03 compute-0 sudo[93893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkgugksdnpqnvfxgttdlhmytnwbpyzil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326183.215669-1082-225996937911945/AnsiballZ_podman_container_info.py'
Oct 01 13:43:03 compute-0 sudo[93893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:03 compute-0 python3.9[93895]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 01 13:43:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:43:03 compute-0 sudo[93893]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:04 compute-0 sudo[94057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mafubzglewetpdhnswaedzrazaaxeosz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759326184.3710635-1108-167295184425582/AnsiballZ_edpm_container_manage.py'
Oct 01 13:43:04 compute-0 sudo[94057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:05 compute-0 python3[94059]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 01 13:43:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:43:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:43:05 compute-0 podman[94094]: 2025-10-01 13:43:05.457368556 +0000 UTC m=+0.087473579 container create ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 01 13:43:05 compute-0 podman[94094]: 2025-10-01 13:43:05.393089928 +0000 UTC m=+0.023194971 image pull c8ef9d5640b125c1f3577d8f712edab51eb0591f40b9f49028ec5b54753f0392 38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Oct 01 13:43:05 compute-0 python3[94059]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Oct 01 13:43:05 compute-0 sudo[94057]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:06 compute-0 sudo[94282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwratidueusfgmidkokcrsqpvsgmiclx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326185.815534-1124-229800944452996/AnsiballZ_stat.py'
Oct 01 13:43:06 compute-0 sudo[94282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:06 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct 01 13:43:06 compute-0 python3.9[94284]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:43:06 compute-0 sudo[94282]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:06 compute-0 sudo[94436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgxxdxwbpguicptcadmsurlkfgjhrauo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326186.659591-1142-15224215794231/AnsiballZ_file.py'
Oct 01 13:43:06 compute-0 sudo[94436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:07 compute-0 python3.9[94438]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:43:07 compute-0 sudo[94436]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:07 compute-0 sudo[94512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdyejmdpmjijswfgankrsfxeafophntq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326186.659591-1142-15224215794231/AnsiballZ_stat.py'
Oct 01 13:43:07 compute-0 sudo[94512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:07 compute-0 python3.9[94514]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:43:07 compute-0 sudo[94512]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:08 compute-0 sudo[94663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxzbubwffwqchjedtihlbodxwbpyzlic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326187.782179-1142-148790501731704/AnsiballZ_copy.py'
Oct 01 13:43:08 compute-0 sudo[94663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:08 compute-0 python3.9[94665]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759326187.782179-1142-148790501731704/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:43:08 compute-0 sudo[94663]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:08 compute-0 sudo[94739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkpslkcoynldwfxdkjuccllbrmgtfdgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326187.782179-1142-148790501731704/AnsiballZ_systemd.py'
Oct 01 13:43:08 compute-0 sudo[94739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:09 compute-0 python3.9[94741]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:43:09 compute-0 systemd[1]: Reloading.
Oct 01 13:43:09 compute-0 systemd-rc-local-generator[94761]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:43:09 compute-0 systemd-sysv-generator[94769]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:43:09 compute-0 sudo[94739]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:09 compute-0 sudo[94850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dezyxaupojiinzbhrstqdorcdkjrubuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326187.782179-1142-148790501731704/AnsiballZ_systemd.py'
Oct 01 13:43:09 compute-0 sudo[94850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:10 compute-0 python3.9[94852]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:43:10 compute-0 systemd[1]: Reloading.
Oct 01 13:43:10 compute-0 systemd-rc-local-generator[94877]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:43:10 compute-0 systemd-sysv-generator[94880]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:43:10 compute-0 systemd[1]: Starting ovn_controller container...
Oct 01 13:43:10 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct 01 13:43:10 compute-0 systemd[1]: Started libcrun container.
Oct 01 13:43:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4889e839cd2ce4dff7041a1ba426f015dfe28aa73f68d7ae84d767d9cb81e297/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 01 13:43:10 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f.
Oct 01 13:43:10 compute-0 podman[94893]: 2025-10-01 13:43:10.772519323 +0000 UTC m=+0.211576991 container init ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930)
Oct 01 13:43:10 compute-0 ovn_controller[94909]: + sudo -E kolla_set_configs
Oct 01 13:43:10 compute-0 podman[94893]: 2025-10-01 13:43:10.811456328 +0000 UTC m=+0.250513986 container start ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 01 13:43:10 compute-0 edpm-start-podman-container[94893]: ovn_controller
Oct 01 13:43:10 compute-0 systemd[1]: Created slice User Slice of UID 0.
Oct 01 13:43:10 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 01 13:43:10 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 01 13:43:10 compute-0 edpm-start-podman-container[94892]: Creating additional drop-in dependency for "ovn_controller" (ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f)
Oct 01 13:43:10 compute-0 systemd[1]: Starting User Manager for UID 0...
Oct 01 13:43:10 compute-0 systemd[94943]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Oct 01 13:43:10 compute-0 systemd[1]: Reloading.
Oct 01 13:43:10 compute-0 podman[94916]: 2025-10-01 13:43:10.941849974 +0000 UTC m=+0.112613263 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_controller)
Oct 01 13:43:10 compute-0 systemd-rc-local-generator[94999]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:43:11 compute-0 systemd-sysv-generator[95004]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:43:11 compute-0 systemd[94943]: Queued start job for default target Main User Target.
Oct 01 13:43:11 compute-0 systemd[94943]: Created slice User Application Slice.
Oct 01 13:43:11 compute-0 systemd[94943]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 01 13:43:11 compute-0 systemd[94943]: Started Daily Cleanup of User's Temporary Directories.
Oct 01 13:43:11 compute-0 systemd[94943]: Reached target Paths.
Oct 01 13:43:11 compute-0 systemd[94943]: Reached target Timers.
Oct 01 13:43:11 compute-0 systemd[94943]: Starting D-Bus User Message Bus Socket...
Oct 01 13:43:11 compute-0 systemd[94943]: Starting Create User's Volatile Files and Directories...
Oct 01 13:43:11 compute-0 systemd[94943]: Finished Create User's Volatile Files and Directories.
Oct 01 13:43:11 compute-0 systemd[94943]: Listening on D-Bus User Message Bus Socket.
Oct 01 13:43:11 compute-0 systemd[94943]: Reached target Sockets.
Oct 01 13:43:11 compute-0 systemd[94943]: Reached target Basic System.
Oct 01 13:43:11 compute-0 systemd[94943]: Reached target Main User Target.
Oct 01 13:43:11 compute-0 systemd[94943]: Startup finished in 142ms.
Oct 01 13:43:11 compute-0 systemd[1]: Started User Manager for UID 0.
Oct 01 13:43:11 compute-0 systemd[1]: Started ovn_controller container.
Oct 01 13:43:11 compute-0 systemd[1]: ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f-57191ee720f3a445.service: Main process exited, code=exited, status=1/FAILURE
Oct 01 13:43:11 compute-0 systemd[1]: ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f-57191ee720f3a445.service: Failed with result 'exit-code'.
Oct 01 13:43:11 compute-0 systemd[1]: Started Session c1 of User root.
Oct 01 13:43:11 compute-0 sudo[94850]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:11 compute-0 ovn_controller[94909]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 01 13:43:11 compute-0 ovn_controller[94909]: INFO:__main__:Validating config file
Oct 01 13:43:11 compute-0 ovn_controller[94909]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 01 13:43:11 compute-0 ovn_controller[94909]: INFO:__main__:Writing out command to execute
Oct 01 13:43:11 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Oct 01 13:43:11 compute-0 ovn_controller[94909]: ++ cat /run_command
Oct 01 13:43:11 compute-0 ovn_controller[94909]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 01 13:43:11 compute-0 ovn_controller[94909]: + ARGS=
Oct 01 13:43:11 compute-0 ovn_controller[94909]: + sudo kolla_copy_cacerts
Oct 01 13:43:11 compute-0 systemd[1]: Started Session c2 of User root.
Oct 01 13:43:11 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Oct 01 13:43:11 compute-0 ovn_controller[94909]: + [[ ! -n '' ]]
Oct 01 13:43:11 compute-0 ovn_controller[94909]: + . kolla_extend_start
Oct 01 13:43:11 compute-0 ovn_controller[94909]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct 01 13:43:11 compute-0 ovn_controller[94909]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct 01 13:43:11 compute-0 ovn_controller[94909]: + umask 0022
Oct 01 13:43:11 compute-0 ovn_controller[94909]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Oct 01 13:43:11 compute-0 ovn_controller[94909]: 2025-10-01T13:43:11Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Oct 01 13:43:11 compute-0 NetworkManager[51741]: <info>  [1759326191.3909] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct 01 13:43:11 compute-0 NetworkManager[51741]: <info>  [1759326191.3918] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct 01 13:43:11 compute-0 NetworkManager[51741]: <info>  [1759326191.3933] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Oct 01 13:43:11 compute-0 NetworkManager[51741]: <info>  [1759326191.3941] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Oct 01 13:43:11 compute-0 NetworkManager[51741]: <info>  [1759326191.3947] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct 01 13:43:11 compute-0 kernel: br-int: entered promiscuous mode
Oct 01 13:43:11 compute-0 systemd-udevd[95069]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 13:43:11 compute-0 sudo[95174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmxgunlcyzjnwkrypfypsosuqqepgvtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326191.4350467-1198-266271217060587/AnsiballZ_command.py'
Oct 01 13:43:11 compute-0 sudo[95174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:12 compute-0 python3.9[95176]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:43:12 compute-0 ovs-vsctl[95177]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct 01 13:43:12 compute-0 sudo[95174]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00025|main|INFO|OVS feature set changed, force recompute.
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00029|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00030|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00032|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00033|features|INFO|OVS Feature: meter_support, state: supported
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00034|features|INFO|OVS Feature: group_support, state: supported
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00035|main|INFO|OVS feature set changed, force recompute.
Oct 01 13:43:12 compute-0 NetworkManager[51741]: <info>  [1759326192.4595] manager: (ovn-d71f76-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct 01 13:43:12 compute-0 ovn_controller[94909]: 2025-10-01T13:43:12Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct 01 13:43:12 compute-0 NetworkManager[51741]: <info>  [1759326192.4612] manager: (ovn-90ae80-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Oct 01 13:43:12 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Oct 01 13:43:12 compute-0 systemd-udevd[95076]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 13:43:12 compute-0 NetworkManager[51741]: <info>  [1759326192.4887] device (genev_sys_6081): carrier: link connected
Oct 01 13:43:12 compute-0 NetworkManager[51741]: <info>  [1759326192.4890] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Oct 01 13:43:12 compute-0 sudo[95331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amflfxogaajmfawlvmasedewneyqgnlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326192.2839239-1214-11673924858674/AnsiballZ_command.py'
Oct 01 13:43:12 compute-0 sudo[95331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:12 compute-0 python3.9[95333]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:43:12 compute-0 ovs-vsctl[95335]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct 01 13:43:12 compute-0 sudo[95331]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:13 compute-0 sudo[95486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inojfwcewtgggrlgtckjiojtzwvtmoum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326193.4103258-1242-93006744381248/AnsiballZ_command.py'
Oct 01 13:43:13 compute-0 sudo[95486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:14 compute-0 python3.9[95488]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:43:14 compute-0 ovs-vsctl[95489]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct 01 13:43:14 compute-0 sudo[95486]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:14 compute-0 sshd-session[84414]: Connection closed by 192.168.122.30 port 42584
Oct 01 13:43:14 compute-0 sshd-session[84411]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:43:14 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Oct 01 13:43:14 compute-0 systemd[1]: session-21.scope: Consumed 50.815s CPU time.
Oct 01 13:43:14 compute-0 systemd-logind[791]: Session 21 logged out. Waiting for processes to exit.
Oct 01 13:43:14 compute-0 systemd-logind[791]: Removed session 21.
Oct 01 13:43:20 compute-0 sshd-session[95514]: Accepted publickey for zuul from 192.168.122.30 port 36416 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:43:20 compute-0 systemd-logind[791]: New session 23 of user zuul.
Oct 01 13:43:20 compute-0 systemd[1]: Started Session 23 of User zuul.
Oct 01 13:43:20 compute-0 sshd-session[95514]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:43:21 compute-0 python3.9[95667]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:43:21 compute-0 systemd[1]: Stopping User Manager for UID 0...
Oct 01 13:43:21 compute-0 systemd[94943]: Activating special unit Exit the Session...
Oct 01 13:43:21 compute-0 systemd[94943]: Stopped target Main User Target.
Oct 01 13:43:21 compute-0 systemd[94943]: Stopped target Basic System.
Oct 01 13:43:21 compute-0 systemd[94943]: Stopped target Paths.
Oct 01 13:43:21 compute-0 systemd[94943]: Stopped target Sockets.
Oct 01 13:43:21 compute-0 systemd[94943]: Stopped target Timers.
Oct 01 13:43:21 compute-0 systemd[94943]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 01 13:43:21 compute-0 systemd[94943]: Closed D-Bus User Message Bus Socket.
Oct 01 13:43:21 compute-0 systemd[94943]: Stopped Create User's Volatile Files and Directories.
Oct 01 13:43:21 compute-0 systemd[94943]: Removed slice User Application Slice.
Oct 01 13:43:21 compute-0 systemd[94943]: Reached target Shutdown.
Oct 01 13:43:21 compute-0 systemd[94943]: Finished Exit the Session.
Oct 01 13:43:21 compute-0 systemd[94943]: Reached target Exit the Session.
Oct 01 13:43:21 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Oct 01 13:43:21 compute-0 systemd[1]: Stopped User Manager for UID 0.
Oct 01 13:43:21 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 01 13:43:21 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 01 13:43:21 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 01 13:43:21 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 01 13:43:21 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Oct 01 13:43:22 compute-0 sudo[95824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzrfjertgizgfibtdrgnsxvjxtfwkbxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326201.7586117-48-190362966309557/AnsiballZ_file.py'
Oct 01 13:43:22 compute-0 sudo[95824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:22 compute-0 python3.9[95826]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:22 compute-0 sudo[95824]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:23 compute-0 sudo[95976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tewmmcwuzxiwltkpgpgtizcnusgalsky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326202.6619775-48-36495249090486/AnsiballZ_file.py'
Oct 01 13:43:23 compute-0 sudo[95976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:23 compute-0 python3.9[95978]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:23 compute-0 sudo[95976]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:23 compute-0 sudo[96128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcjpdyfkucbouytdvgmixbcgqmhjmowz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326203.4119177-48-244858124931044/AnsiballZ_file.py'
Oct 01 13:43:23 compute-0 sudo[96128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:23 compute-0 python3.9[96130]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:23 compute-0 sudo[96128]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:24 compute-0 sudo[96280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acrmatipbevgajmoaodycfqtwvlaeucf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326204.131369-48-3281192921928/AnsiballZ_file.py'
Oct 01 13:43:24 compute-0 sudo[96280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:24 compute-0 python3.9[96282]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:24 compute-0 sudo[96280]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:25 compute-0 sudo[96432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsrqxjsetaydzacylgdrcdszohgxeahi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326204.8872945-48-176296922971125/AnsiballZ_file.py'
Oct 01 13:43:25 compute-0 sudo[96432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:25 compute-0 python3.9[96434]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:25 compute-0 sudo[96432]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:26 compute-0 python3.9[96584]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:43:27 compute-0 sudo[96734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqygezyhrceybvskuxlveyvldyuvbnkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326206.5400712-136-275124038041937/AnsiballZ_seboolean.py'
Oct 01 13:43:27 compute-0 sudo[96734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:27 compute-0 python3.9[96736]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct 01 13:43:27 compute-0 sudo[96734]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:28 compute-0 python3.9[96886]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:29 compute-0 python3.9[97007]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326208.1535695-152-125800782909867/.source follow=False _original_basename=haproxy.j2 checksum=e770ff414b0fadca51d134a12efcc6c9b048ec99 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:30 compute-0 python3.9[97157]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:31 compute-0 python3.9[97279]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326209.8560412-182-90658205219033/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:31 compute-0 sudo[97429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdaujlvpywbvjvbablbhtqsppdrylyzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326211.458762-216-257303481386764/AnsiballZ_setup.py'
Oct 01 13:43:31 compute-0 sudo[97429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:32 compute-0 python3.9[97431]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:43:32 compute-0 sudo[97429]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:32 compute-0 sudo[97513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pellizmcwbiwqtvmtorowjtmhxayatru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326211.458762-216-257303481386764/AnsiballZ_dnf.py'
Oct 01 13:43:32 compute-0 sudo[97513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:33 compute-0 python3.9[97515]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:43:34 compute-0 sudo[97513]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:35 compute-0 sudo[97666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bekuhaxzgxvvzxkvdwirdgfjlolxipem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326214.5757267-240-205964624512511/AnsiballZ_systemd.py'
Oct 01 13:43:35 compute-0 sudo[97666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:35 compute-0 python3.9[97668]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 01 13:43:35 compute-0 sudo[97666]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:36 compute-0 python3.9[97821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:37 compute-0 python3.9[97942]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326215.8728473-256-33147799239105/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:37 compute-0 python3.9[98092]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:38 compute-0 python3.9[98213]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326217.2601373-256-80780660717240/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:39 compute-0 python3.9[98363]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:40 compute-0 python3.9[98484]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326219.462919-344-275605483273235/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:41 compute-0 python3.9[98634]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:41 compute-0 ovn_controller[94909]: 2025-10-01T13:43:41Z|00038|memory|INFO|16116 kB peak resident set size after 30.6 seconds
Oct 01 13:43:41 compute-0 ovn_controller[94909]: 2025-10-01T13:43:41Z|00039|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Oct 01 13:43:41 compute-0 podman[98729]: 2025-10-01 13:43:41.946334834 +0000 UTC m=+0.146780474 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 13:43:42 compute-0 python3.9[98767]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326220.8327343-344-101355577237316/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:42 compute-0 python3.9[98930]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:43:43 compute-0 sudo[99082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phpsqxnmwwwzrzusvqphvmqmffmfrzuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326223.1560435-420-9059436743362/AnsiballZ_file.py'
Oct 01 13:43:43 compute-0 sudo[99082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:43 compute-0 python3.9[99084]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:43 compute-0 sudo[99082]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:44 compute-0 sudo[99234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muxwvbswrzueyyyqurkrlzxdalpfobpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326223.9575117-436-6592721813071/AnsiballZ_stat.py'
Oct 01 13:43:44 compute-0 sudo[99234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:44 compute-0 python3.9[99236]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:44 compute-0 sudo[99234]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:44 compute-0 sudo[99312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quyqfubemjtkoapxrtacxtlrcpfjesdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326223.9575117-436-6592721813071/AnsiballZ_file.py'
Oct 01 13:43:44 compute-0 sudo[99312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:45 compute-0 python3.9[99314]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:45 compute-0 sudo[99312]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:45 compute-0 sudo[99464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmkuzvleeztqafgjkwhqttztbrlacjfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326225.2598958-436-93870689665092/AnsiballZ_stat.py'
Oct 01 13:43:45 compute-0 sudo[99464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:45 compute-0 python3.9[99466]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:45 compute-0 sudo[99464]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:46 compute-0 sudo[99542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmhblcpwzkwyrfdwscbjcxvmexmgozga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326225.2598958-436-93870689665092/AnsiballZ_file.py'
Oct 01 13:43:46 compute-0 sudo[99542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:46 compute-0 python3.9[99544]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:46 compute-0 sudo[99542]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:46 compute-0 sudo[99694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iteamzkvtnkxophnhzkeyfkdbzzwbdbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326226.5329854-482-134199074440544/AnsiballZ_file.py'
Oct 01 13:43:46 compute-0 sudo[99694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:47 compute-0 python3.9[99696]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:43:47 compute-0 sudo[99694]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:47 compute-0 sudo[99846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkxxugnmopaqsxkjtbhldlwzgmdubedp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326227.2233956-498-219640113965422/AnsiballZ_stat.py'
Oct 01 13:43:47 compute-0 sudo[99846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:47 compute-0 python3.9[99848]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:47 compute-0 sudo[99846]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:48 compute-0 sudo[99924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqzrnuwlxhrkfqzhozjqdoiaeftmptws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326227.2233956-498-219640113965422/AnsiballZ_file.py'
Oct 01 13:43:48 compute-0 sudo[99924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:48 compute-0 python3.9[99926]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:43:48 compute-0 sudo[99924]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:48 compute-0 sudo[100076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wejapkqzyptweqmoiwkpnvdphvjpjqbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326228.4917514-522-111764024500537/AnsiballZ_stat.py'
Oct 01 13:43:48 compute-0 sudo[100076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:49 compute-0 python3.9[100078]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:49 compute-0 sudo[100076]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:49 compute-0 sudo[100154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxukwpueoqkhbhmmvphsbffrkrzsmjpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326228.4917514-522-111764024500537/AnsiballZ_file.py'
Oct 01 13:43:49 compute-0 sudo[100154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:49 compute-0 python3.9[100156]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:43:49 compute-0 sudo[100154]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:50 compute-0 sudo[100306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxdhfkvzvufocqpdgsyhmdgwohmnnxbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326229.7881007-546-144939649249038/AnsiballZ_systemd.py'
Oct 01 13:43:50 compute-0 sudo[100306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:50 compute-0 python3.9[100308]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:43:50 compute-0 systemd[1]: Reloading.
Oct 01 13:43:50 compute-0 systemd-rc-local-generator[100335]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:43:50 compute-0 systemd-sysv-generator[100339]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:43:50 compute-0 sudo[100306]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:51 compute-0 sudo[100494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agyocxrsqvvqvulruftxxldhfhbktxkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326230.9390984-562-15467577018834/AnsiballZ_stat.py'
Oct 01 13:43:51 compute-0 sudo[100494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:51 compute-0 python3.9[100496]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:51 compute-0 sudo[100494]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:51 compute-0 sudo[100572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wttocbrkiufmfresrzkcmjgrahujszol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326230.9390984-562-15467577018834/AnsiballZ_file.py'
Oct 01 13:43:51 compute-0 sudo[100572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:52 compute-0 python3.9[100574]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:43:52 compute-0 sudo[100572]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:52 compute-0 sudo[100724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dozulwahqhbknoiyzlcaexllnavsftqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326232.2164123-586-254976314206034/AnsiballZ_stat.py'
Oct 01 13:43:52 compute-0 sudo[100724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:52 compute-0 python3.9[100726]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:52 compute-0 sudo[100724]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:53 compute-0 sudo[100802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjkcktgboxopmqolulipgpeifwaytpuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326232.2164123-586-254976314206034/AnsiballZ_file.py'
Oct 01 13:43:53 compute-0 sudo[100802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:53 compute-0 python3.9[100804]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:43:53 compute-0 sudo[100802]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:53 compute-0 sudo[100955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfcdgjlpgexoxrpsmpgziuwclfeummdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326233.6014106-610-204937111473945/AnsiballZ_systemd.py'
Oct 01 13:43:53 compute-0 sudo[100955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:54 compute-0 python3.9[100957]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:43:54 compute-0 systemd[1]: Reloading.
Oct 01 13:43:54 compute-0 systemd-rc-local-generator[100986]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:43:54 compute-0 systemd-sysv-generator[100990]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:43:54 compute-0 systemd[1]: Starting Create netns directory...
Oct 01 13:43:54 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 01 13:43:54 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 01 13:43:54 compute-0 systemd[1]: Finished Create netns directory.
Oct 01 13:43:54 compute-0 sudo[100955]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:55 compute-0 sudo[101148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etxsroqtxiixpslgsbupcnbypnpdzwej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326235.0867896-630-260609220133792/AnsiballZ_file.py'
Oct 01 13:43:55 compute-0 sudo[101148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:55 compute-0 python3.9[101150]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:55 compute-0 sudo[101148]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:56 compute-0 sudo[101300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svsuzmifswbcqzyiaflgvuvcejbotwfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326235.9353244-646-33501961910466/AnsiballZ_stat.py'
Oct 01 13:43:56 compute-0 sudo[101300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:56 compute-0 python3.9[101302]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:56 compute-0 sudo[101300]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:56 compute-0 sudo[101423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyyatvtizpmoxdncboubvxtrmzldidht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326235.9353244-646-33501961910466/AnsiballZ_copy.py'
Oct 01 13:43:56 compute-0 sudo[101423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:57 compute-0 python3.9[101425]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326235.9353244-646-33501961910466/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:57 compute-0 sudo[101423]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:57 compute-0 sudo[101575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyktikrtgrtsrewjgnoetiigfgxgnnte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326237.4859717-680-126234530070590/AnsiballZ_file.py'
Oct 01 13:43:57 compute-0 sudo[101575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:57 compute-0 python3.9[101577]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:43:58 compute-0 sudo[101575]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:58 compute-0 sudo[101727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wusanhcrdibjskzthnpiinmvugvhdntx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326238.3785374-696-266613373104201/AnsiballZ_stat.py'
Oct 01 13:43:58 compute-0 sudo[101727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:58 compute-0 python3.9[101729]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:43:58 compute-0 sudo[101727]: pam_unix(sudo:session): session closed for user root
Oct 01 13:43:59 compute-0 sudo[101850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwsqlyrcfhwvucnxgjhmvlxlqhsznrdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326238.3785374-696-266613373104201/AnsiballZ_copy.py'
Oct 01 13:43:59 compute-0 sudo[101850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:43:59 compute-0 python3.9[101852]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326238.3785374-696-266613373104201/.source.json _original_basename=.s15p6bwp follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:43:59 compute-0 sudo[101850]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:00 compute-0 sudo[102002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiwojeqxtsxodtpbcvzdfnrhiyfjlzgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326239.760659-726-9216345680172/AnsiballZ_file.py'
Oct 01 13:44:00 compute-0 sudo[102002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:00 compute-0 python3.9[102004]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:00 compute-0 sudo[102002]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:00 compute-0 sudo[102154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlkhdijaqupaqdykxljdlbblwacpzkhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326240.5965376-742-233826695075957/AnsiballZ_stat.py'
Oct 01 13:44:00 compute-0 sudo[102154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:01 compute-0 sudo[102154]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:01 compute-0 sudo[102277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrugvlstsnbdtsjqeeplvsfjdljpiwgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326240.5965376-742-233826695075957/AnsiballZ_copy.py'
Oct 01 13:44:01 compute-0 sudo[102277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:01 compute-0 sudo[102277]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:02 compute-0 sudo[102429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxfvybanfddlspvynpodqaptphcbouik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326242.284527-776-120276936566310/AnsiballZ_container_config_data.py'
Oct 01 13:44:02 compute-0 sudo[102429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:03 compute-0 python3.9[102431]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct 01 13:44:03 compute-0 sudo[102429]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:03 compute-0 sudo[102581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvhgzothuojmhcbuyovyhpbqnoyvjsux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326243.3393252-794-232883208616631/AnsiballZ_container_config_hash.py'
Oct 01 13:44:03 compute-0 sudo[102581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:04 compute-0 python3.9[102583]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 01 13:44:04 compute-0 sudo[102581]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:04 compute-0 sudo[102733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slheetxbyzextlznhobewictgpqimalh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326244.3588543-812-105980736770920/AnsiballZ_podman_container_info.py'
Oct 01 13:44:04 compute-0 sudo[102733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:05 compute-0 python3.9[102735]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 01 13:44:05 compute-0 sudo[102733]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:06 compute-0 sudo[102911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xectkzlznbmxvvdlaxqtgyzesdltcmuc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759326245.8771381-838-204005436034086/AnsiballZ_edpm_container_manage.py'
Oct 01 13:44:06 compute-0 sudo[102911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:06 compute-0 python3[102913]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 01 13:44:06 compute-0 podman[102948]: 2025-10-01 13:44:06.916481766 +0000 UTC m=+0.044437905 container create 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, io.buildah.version=1.41.4)
Oct 01 13:44:06 compute-0 podman[102948]: 2025-10-01 13:44:06.891734843 +0000 UTC m=+0.019690982 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 13:44:06 compute-0 python3[102913]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 13:44:07 compute-0 sudo[102911]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:07 compute-0 sudo[103136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkitityfiropaulsvwggqezfzkqaexwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326247.3005962-854-96608395916009/AnsiballZ_stat.py'
Oct 01 13:44:07 compute-0 sudo[103136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:07 compute-0 python3.9[103138]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:44:07 compute-0 sudo[103136]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:08 compute-0 sudo[103290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzpaurzrvlazubieoxaeuurywzleefvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326248.2225041-872-156591673938719/AnsiballZ_file.py'
Oct 01 13:44:08 compute-0 sudo[103290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:08 compute-0 python3.9[103292]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:08 compute-0 sudo[103290]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:09 compute-0 sudo[103366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmzduruzjmnrjnamaiukioolzbrjglsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326248.2225041-872-156591673938719/AnsiballZ_stat.py'
Oct 01 13:44:09 compute-0 sudo[103366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:09 compute-0 python3.9[103368]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:44:09 compute-0 sudo[103366]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:09 compute-0 sudo[103517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxqdbpvdaulgocaywxgoritjgrqtrvbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326249.324443-872-43940774823855/AnsiballZ_copy.py'
Oct 01 13:44:09 compute-0 sudo[103517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:10 compute-0 python3.9[103519]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759326249.324443-872-43940774823855/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:10 compute-0 sudo[103517]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:10 compute-0 sudo[103593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sffseahuekdetiovouvqtfxdmgbngssw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326249.324443-872-43940774823855/AnsiballZ_systemd.py'
Oct 01 13:44:10 compute-0 sudo[103593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:10 compute-0 python3.9[103595]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:44:10 compute-0 systemd[1]: Reloading.
Oct 01 13:44:10 compute-0 systemd-rc-local-generator[103623]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:44:10 compute-0 systemd-sysv-generator[103626]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:44:10 compute-0 sudo[103593]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:11 compute-0 sudo[103704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itadeujcmeharwlpmlmptulalotpzstw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326249.324443-872-43940774823855/AnsiballZ_systemd.py'
Oct 01 13:44:11 compute-0 sudo[103704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:11 compute-0 python3.9[103706]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:44:11 compute-0 systemd[1]: Reloading.
Oct 01 13:44:11 compute-0 systemd-sysv-generator[103740]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:44:11 compute-0 systemd-rc-local-generator[103737]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:44:11 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Oct 01 13:44:12 compute-0 systemd[1]: Started libcrun container.
Oct 01 13:44:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59b11dca795a95880a0b1579af543a20d41ab4e05650728a05f0cdfd2906e50a/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct 01 13:44:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59b11dca795a95880a0b1579af543a20d41ab4e05650728a05f0cdfd2906e50a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 13:44:12 compute-0 podman[103745]: 2025-10-01 13:44:12.169465449 +0000 UTC m=+0.161044543 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 01 13:44:12 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3.
Oct 01 13:44:12 compute-0 podman[103748]: 2025-10-01 13:44:12.187510168 +0000 UTC m=+0.167603384 container init 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: + sudo -E kolla_set_configs
Oct 01 13:44:12 compute-0 podman[103748]: 2025-10-01 13:44:12.231329866 +0000 UTC m=+0.211423042 container start 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 01 13:44:12 compute-0 edpm-start-podman-container[103748]: ovn_metadata_agent
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Validating config file
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Copying service configuration files
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Writing out command to execute
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Setting permission for /var/lib/neutron
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: ++ cat /run_command
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: + CMD=neutron-ovn-metadata-agent
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: + ARGS=
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: + sudo kolla_copy_cacerts
Oct 01 13:44:12 compute-0 podman[103793]: 2025-10-01 13:44:12.335546024 +0000 UTC m=+0.081991841 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 01 13:44:12 compute-0 edpm-start-podman-container[103747]: Creating additional drop-in dependency for "ovn_metadata_agent" (3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3)
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: + [[ ! -n '' ]]
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: + . kolla_extend_start
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: Running command: 'neutron-ovn-metadata-agent'
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: + umask 0022
Oct 01 13:44:12 compute-0 ovn_metadata_agent[103777]: + exec neutron-ovn-metadata-agent
Oct 01 13:44:12 compute-0 systemd[1]: Reloading.
Oct 01 13:44:12 compute-0 systemd-rc-local-generator[103865]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:44:12 compute-0 systemd-sysv-generator[103870]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:44:12 compute-0 systemd[1]: Started ovn_metadata_agent container.
Oct 01 13:44:12 compute-0 sudo[103704]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:13 compute-0 sshd-session[95517]: Connection closed by 192.168.122.30 port 36416
Oct 01 13:44:13 compute-0 sshd-session[95514]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:44:13 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Oct 01 13:44:13 compute-0 systemd[1]: session-23.scope: Consumed 39.976s CPU time.
Oct 01 13:44:13 compute-0 systemd-logind[791]: Session 23 logged out. Waiting for processes to exit.
Oct 01 13:44:13 compute-0 systemd-logind[791]: Removed session 23.
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.147 103791 INFO neutron.common.config [-] Logging enabled!
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.147 103791 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.147 103791 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.148 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.148 103791 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.148 103791 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.148 103791 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.148 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.148 103791 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.148 103791 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.148 103791 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.148 103791 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.149 103791 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.150 103791 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.151 103791 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.152 103791 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.163 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.153 103791 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.154 103791 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.155 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.156 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.157 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.158 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.159 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.160 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.161 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.162 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.163 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.164 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.165 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.166 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.167 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.168 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.169 103791 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.170 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.171 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.172 103791 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.180 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.180 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.180 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.181 103791 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.181 103791 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.189 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 10cf9814-09fa-4bad-879a-270f9b64eda3 (UUID: 10cf9814-09fa-4bad-879a-270f9b64eda3) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.211 103791 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.212 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.212 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.212 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.212 103791 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.215 103791 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.220 103791 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.228 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '10cf9814-09fa-4bad-879a-270f9b64eda3'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], external_ids={}, name=10cf9814-09fa-4bad-879a-270f9b64eda3, nb_cfg_timestamp=1759326200417, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.230 103791 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpd1fcycrm/privsep.sock']
Oct 01 13:44:14 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.950 103791 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.950 103791 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpd1fcycrm/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.816 103910 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.822 103910 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.826 103910 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.826 103910 INFO oslo.privsep.daemon [-] privsep daemon running as pid 103910
Oct 01 13:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:14.952 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[dd5022f4-f4fa-4d5a-ada2-2ea1bdaf6feb]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 13:44:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:15.385 103910 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:44:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:15.385 103910 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:44:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:15.385 103910 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:44:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:15.815 103910 INFO oslo_service.backend [-] Loading backend: eventlet
Oct 01 13:44:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:15.820 103910 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Oct 01 13:44:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:15.856 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[e233472c-d45d-468b-b2a2-89a4980857ed]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 13:44:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:15.857 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, column=external_ids, values=({'neutron:ovn-metadata-id': 'b3582db4-5408-5940-baf5-17446c851dae'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 13:44:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:15.871 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 13:44:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:44:15.880 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 13:44:18 compute-0 sshd-session[103915]: Accepted publickey for zuul from 192.168.122.30 port 44150 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:44:18 compute-0 systemd-logind[791]: New session 24 of user zuul.
Oct 01 13:44:18 compute-0 systemd[1]: Started Session 24 of User zuul.
Oct 01 13:44:18 compute-0 sshd-session[103915]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:44:19 compute-0 python3.9[104068]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:44:20 compute-0 sudo[104222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyhqvyquhqjdiwypbhpxwbkbddqyslgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326260.0150435-48-202754756268811/AnsiballZ_command.py'
Oct 01 13:44:20 compute-0 sudo[104222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:20 compute-0 python3.9[104224]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:44:20 compute-0 sudo[104222]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:21 compute-0 sudo[104387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihssnuungeplrkrkeumhvjwowktdzlui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326261.386819-70-205689169007245/AnsiballZ_systemd_service.py'
Oct 01 13:44:21 compute-0 sudo[104387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:22 compute-0 python3.9[104389]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:44:22 compute-0 systemd[1]: Reloading.
Oct 01 13:44:22 compute-0 systemd-rc-local-generator[104411]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:44:22 compute-0 systemd-sysv-generator[104414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:44:22 compute-0 sudo[104387]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:23 compute-0 python3.9[104574]: ansible-ansible.builtin.service_facts Invoked
Oct 01 13:44:23 compute-0 network[104591]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 01 13:44:23 compute-0 network[104592]: 'network-scripts' will be removed from distribution in near future.
Oct 01 13:44:23 compute-0 network[104593]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 01 13:44:28 compute-0 sudo[104855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbkyqhuuvdignywteimmowxqdgbvwxrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326267.9801803-108-34784802221966/AnsiballZ_systemd_service.py'
Oct 01 13:44:28 compute-0 sudo[104855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:28 compute-0 python3.9[104857]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:44:28 compute-0 sudo[104855]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:29 compute-0 sudo[105008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmqwinynpsogxedpxmysciqoitxaouor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326268.885446-108-60913541429470/AnsiballZ_systemd_service.py'
Oct 01 13:44:29 compute-0 sudo[105008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:29 compute-0 python3.9[105010]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:44:29 compute-0 sudo[105008]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:30 compute-0 sudo[105161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epuxivbvgrjegpeieqwrkzzgxdhmhzbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326269.7900465-108-12141589296025/AnsiballZ_systemd_service.py'
Oct 01 13:44:30 compute-0 sudo[105161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:30 compute-0 python3.9[105163]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:44:30 compute-0 sudo[105161]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:31 compute-0 sudo[105314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsvyrcrapsjsvgbxtxpglftodbeuntue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326270.6650164-108-252589529398270/AnsiballZ_systemd_service.py'
Oct 01 13:44:31 compute-0 sudo[105314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:31 compute-0 python3.9[105316]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:44:31 compute-0 sudo[105314]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:31 compute-0 sudo[105467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgarturpuvnuetvnmbtuwunvpabsipgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326271.5307963-108-190923215873459/AnsiballZ_systemd_service.py'
Oct 01 13:44:31 compute-0 sudo[105467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:32 compute-0 python3.9[105469]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:44:32 compute-0 sudo[105467]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:32 compute-0 sudo[105620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wftehrjawbadwlzxhjepgpgakidlwurc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326272.4028926-108-93817411787440/AnsiballZ_systemd_service.py'
Oct 01 13:44:32 compute-0 sudo[105620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:33 compute-0 python3.9[105622]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:44:33 compute-0 sudo[105620]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:33 compute-0 sudo[105773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alaujzmlyfjwoplklxasermfvbvxqhio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326273.2607327-108-15466496992015/AnsiballZ_systemd_service.py'
Oct 01 13:44:33 compute-0 sudo[105773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:33 compute-0 python3.9[105775]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:44:34 compute-0 sudo[105773]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:34 compute-0 sudo[105926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhxfqjiadxssqjyzcramlmgwzlytndia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326274.326185-212-120718823842340/AnsiballZ_file.py'
Oct 01 13:44:34 compute-0 sudo[105926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:35 compute-0 python3.9[105928]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:35 compute-0 sudo[105926]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:35 compute-0 sudo[106078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weouccidzfvzpclddgzdrcxmirdiwvju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326275.2147439-212-125480640836185/AnsiballZ_file.py'
Oct 01 13:44:35 compute-0 sudo[106078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:35 compute-0 python3.9[106080]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:35 compute-0 sudo[106078]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:36 compute-0 sudo[106230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgomsjxgxzojerzeytaesvhdaxplvrin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326276.0128434-212-266945392183961/AnsiballZ_file.py'
Oct 01 13:44:36 compute-0 sudo[106230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:36 compute-0 python3.9[106232]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:36 compute-0 sudo[106230]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:37 compute-0 sudo[106382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mymiyisfaskdpgweiaogxyzddbxvqrny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326276.7939844-212-74881049916404/AnsiballZ_file.py'
Oct 01 13:44:37 compute-0 sudo[106382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:37 compute-0 python3.9[106384]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:37 compute-0 sudo[106382]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:37 compute-0 sudo[106534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxrfhinzugulkklrgluiozgcgaxfinvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326277.663369-212-229949027218802/AnsiballZ_file.py'
Oct 01 13:44:37 compute-0 sudo[106534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:38 compute-0 python3.9[106536]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:38 compute-0 sudo[106534]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:38 compute-0 sudo[106686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juifbrdeqaedwckzqidbvucikztgaoef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326278.3497958-212-257819953109968/AnsiballZ_file.py'
Oct 01 13:44:38 compute-0 sudo[106686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:38 compute-0 python3.9[106688]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:38 compute-0 sudo[106686]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:39 compute-0 sudo[106838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgfwkmwtbmxtghmclydjexlbgijflxir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326279.0307746-212-44601113668493/AnsiballZ_file.py'
Oct 01 13:44:39 compute-0 sudo[106838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:39 compute-0 python3.9[106840]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:39 compute-0 sudo[106838]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:40 compute-0 sudo[106990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krxhtajtzpuhpsmqtzplcqnholxbtdmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326279.9129596-312-204889181784035/AnsiballZ_file.py'
Oct 01 13:44:40 compute-0 sudo[106990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:40 compute-0 python3.9[106992]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:40 compute-0 sudo[106990]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:40 compute-0 sudo[107142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gckrvztkrjynqshzokhstinseudxctfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326280.6339529-312-30164377085008/AnsiballZ_file.py'
Oct 01 13:44:40 compute-0 sudo[107142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:41 compute-0 python3.9[107144]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:41 compute-0 sudo[107142]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:41 compute-0 sudo[107294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhyjyivnfckdbbenudyxnjmshmkuxlst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326281.3408446-312-58749726043141/AnsiballZ_file.py'
Oct 01 13:44:41 compute-0 sudo[107294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:41 compute-0 python3.9[107296]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:41 compute-0 sudo[107294]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:42 compute-0 sudo[107459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilsjodqilxbjbblmragpnbnxsloxsqax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326282.0093696-312-98533138368842/AnsiballZ_file.py'
Oct 01 13:44:42 compute-0 sudo[107459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:42 compute-0 podman[107420]: 2025-10-01 13:44:42.511802953 +0000 UTC m=+0.147205406 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 01 13:44:42 compute-0 podman[107465]: 2025-10-01 13:44:42.515520249 +0000 UTC m=+0.057907485 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, container_name=ovn_metadata_agent)
Oct 01 13:44:42 compute-0 python3.9[107468]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:42 compute-0 sudo[107459]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:43 compute-0 sudo[107642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfoevpikeizvwptsmmqbntgsawglyxyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326282.8277936-312-184404493685525/AnsiballZ_file.py'
Oct 01 13:44:43 compute-0 sudo[107642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:43 compute-0 python3.9[107644]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:43 compute-0 sudo[107642]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:43 compute-0 sudo[107794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuqwxdcfuapmurindvsbmsjuzxycpsas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326283.4177947-312-104984869987242/AnsiballZ_file.py'
Oct 01 13:44:43 compute-0 sudo[107794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:43 compute-0 python3.9[107796]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:43 compute-0 sudo[107794]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:44 compute-0 sudo[107946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdokkzsuajyieshbxdvynnffiugukare ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326284.1084514-312-37557679006283/AnsiballZ_file.py'
Oct 01 13:44:44 compute-0 sudo[107946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:44 compute-0 python3.9[107948]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:44:44 compute-0 sudo[107946]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:45 compute-0 sudo[108098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eezmmedjqqbgvyraajmvarngqgvdfzip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326284.9256966-414-233889538752932/AnsiballZ_command.py'
Oct 01 13:44:45 compute-0 sudo[108098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:45 compute-0 python3.9[108100]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:44:45 compute-0 sudo[108098]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:46 compute-0 python3.9[108252]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 01 13:44:46 compute-0 sudo[108402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyrufcjqldudmgpkuldexipqjqgcbzdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326286.6298897-450-219824389567986/AnsiballZ_systemd_service.py'
Oct 01 13:44:46 compute-0 sudo[108402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:47 compute-0 python3.9[108404]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:44:47 compute-0 systemd[1]: Reloading.
Oct 01 13:44:47 compute-0 systemd-sysv-generator[108433]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:44:47 compute-0 systemd-rc-local-generator[108429]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:44:47 compute-0 sudo[108402]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:48 compute-0 sudo[108588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooyphgqtjmgruoropnnghqzndymzwxei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326287.713911-466-134051852100131/AnsiballZ_command.py'
Oct 01 13:44:48 compute-0 sudo[108588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:48 compute-0 python3.9[108590]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:44:48 compute-0 sudo[108588]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:48 compute-0 sudo[108741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgclbbgbjihixqzxychbzrzvuvbldfcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326288.4949718-466-268516276109195/AnsiballZ_command.py'
Oct 01 13:44:48 compute-0 sudo[108741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:49 compute-0 python3.9[108743]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:44:49 compute-0 sudo[108741]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:49 compute-0 sudo[108894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obpmhwzqfvkjldfkyyomphekdltkuszq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326289.2729108-466-13234575783475/AnsiballZ_command.py'
Oct 01 13:44:49 compute-0 sudo[108894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:49 compute-0 python3.9[108896]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:44:49 compute-0 sudo[108894]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:50 compute-0 sudo[109047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grlnajnioxctyqaezzjrgmopszmqfjjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326290.05506-466-184466745201822/AnsiballZ_command.py'
Oct 01 13:44:50 compute-0 sudo[109047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:50 compute-0 python3.9[109049]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:44:50 compute-0 sudo[109047]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:51 compute-0 sudo[109200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzznpywmcrpwkfsfzpdoacnlrnhrosxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326290.8332846-466-94719096606876/AnsiballZ_command.py'
Oct 01 13:44:51 compute-0 sudo[109200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:51 compute-0 python3.9[109202]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:44:51 compute-0 sudo[109200]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:51 compute-0 sudo[109353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxghsiwnkvrytupkunlzkqoezoszwbds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326291.6295927-466-111261111295393/AnsiballZ_command.py'
Oct 01 13:44:51 compute-0 sudo[109353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:52 compute-0 python3.9[109355]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:44:52 compute-0 sudo[109353]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:52 compute-0 sudo[109506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqtlcrhcyuaorfjmzyqmprmdwjclgcwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326292.3903806-466-74953246999960/AnsiballZ_command.py'
Oct 01 13:44:52 compute-0 sudo[109506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:52 compute-0 python3.9[109508]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:44:52 compute-0 sudo[109506]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:53 compute-0 sudo[109659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfkfuyficixkswylxwtdeexdtwucyhgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326293.4469664-574-191123649399849/AnsiballZ_getent.py'
Oct 01 13:44:54 compute-0 sudo[109659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:54 compute-0 python3.9[109661]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct 01 13:44:55 compute-0 sudo[109659]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:56 compute-0 sudo[109812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmprpbbakhictgyixvvegmbqhpmuujuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326295.4866905-590-268948111495674/AnsiballZ_group.py'
Oct 01 13:44:56 compute-0 sudo[109812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:56 compute-0 python3.9[109814]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 01 13:44:56 compute-0 groupadd[109815]: group added to /etc/group: name=libvirt, GID=42473
Oct 01 13:44:56 compute-0 groupadd[109815]: group added to /etc/gshadow: name=libvirt
Oct 01 13:44:56 compute-0 groupadd[109815]: new group: name=libvirt, GID=42473
Oct 01 13:44:56 compute-0 sudo[109812]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:57 compute-0 sudo[109970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uibgobmkalgnswtygkhsdsulxtfhijcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326296.66808-606-265846546546406/AnsiballZ_user.py'
Oct 01 13:44:57 compute-0 sudo[109970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:57 compute-0 python3.9[109972]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 01 13:44:57 compute-0 useradd[109974]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Oct 01 13:44:57 compute-0 sudo[109970]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:58 compute-0 sudo[110130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydrbqdmvnrjculfqmikyjdnvfoknuztw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326297.9717853-628-210972649106218/AnsiballZ_setup.py'
Oct 01 13:44:58 compute-0 sudo[110130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:58 compute-0 python3.9[110132]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:44:58 compute-0 sudo[110130]: pam_unix(sudo:session): session closed for user root
Oct 01 13:44:59 compute-0 sudo[110214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xufnjxjsodayqktfemnpuyihjpiuqqfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326297.9717853-628-210972649106218/AnsiballZ_dnf.py'
Oct 01 13:44:59 compute-0 sudo[110214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:44:59 compute-0 python3.9[110216]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:45:13 compute-0 podman[110401]: 2025-10-01 13:45:13.184048948 +0000 UTC m=+0.091270373 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:45:13 compute-0 podman[110402]: 2025-10-01 13:45:13.227990078 +0000 UTC m=+0.133889697 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct 01 13:45:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:45:14.174 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:45:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:45:14.175 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:45:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:45:14.175 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:45:27 compute-0 kernel: SELinux:  Converting 2754 SID table entries...
Oct 01 13:45:27 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 01 13:45:27 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 01 13:45:27 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 01 13:45:27 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 01 13:45:27 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 01 13:45:27 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 01 13:45:27 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 01 13:45:36 compute-0 kernel: SELinux:  Converting 2754 SID table entries...
Oct 01 13:45:36 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 01 13:45:36 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 01 13:45:36 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 01 13:45:36 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 01 13:45:36 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 01 13:45:36 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 01 13:45:36 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 01 13:45:44 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Oct 01 13:45:44 compute-0 podman[110467]: 2025-10-01 13:45:44.180696665 +0000 UTC m=+0.080888042 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Oct 01 13:45:44 compute-0 podman[110468]: 2025-10-01 13:45:44.216927326 +0000 UTC m=+0.112750745 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true)
Oct 01 13:46:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:46:14.176 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:46:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:46:14.176 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:46:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:46:14.176 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:46:15 compute-0 podman[123781]: 2025-10-01 13:46:15.154208464 +0000 UTC m=+0.074053050 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 01 13:46:15 compute-0 podman[123787]: 2025-10-01 13:46:15.185741224 +0000 UTC m=+0.103107960 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 01 13:46:34 compute-0 kernel: SELinux:  Converting 2755 SID table entries...
Oct 01 13:46:34 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct 01 13:46:34 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct 01 13:46:34 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct 01 13:46:34 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct 01 13:46:34 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct 01 13:46:34 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct 01 13:46:34 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct 01 13:46:35 compute-0 groupadd[127314]: group added to /etc/group: name=dnsmasq, GID=992
Oct 01 13:46:35 compute-0 groupadd[127314]: group added to /etc/gshadow: name=dnsmasq
Oct 01 13:46:35 compute-0 groupadd[127314]: new group: name=dnsmasq, GID=992
Oct 01 13:46:35 compute-0 useradd[127321]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Oct 01 13:46:35 compute-0 dbus-broker-launch[748]: Noticed file-system modification, trigger reload.
Oct 01 13:46:35 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Oct 01 13:46:35 compute-0 dbus-broker-launch[748]: Noticed file-system modification, trigger reload.
Oct 01 13:46:36 compute-0 groupadd[127334]: group added to /etc/group: name=clevis, GID=991
Oct 01 13:46:36 compute-0 groupadd[127334]: group added to /etc/gshadow: name=clevis
Oct 01 13:46:36 compute-0 groupadd[127334]: new group: name=clevis, GID=991
Oct 01 13:46:36 compute-0 useradd[127341]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Oct 01 13:46:37 compute-0 usermod[127351]: add 'clevis' to group 'tss'
Oct 01 13:46:37 compute-0 usermod[127351]: add 'clevis' to shadow group 'tss'
Oct 01 13:46:39 compute-0 polkitd[6977]: Reloading rules
Oct 01 13:46:39 compute-0 polkitd[6977]: Collecting garbage unconditionally...
Oct 01 13:46:39 compute-0 polkitd[6977]: Loading rules from directory /etc/polkit-1/rules.d
Oct 01 13:46:39 compute-0 polkitd[6977]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 01 13:46:39 compute-0 polkitd[6977]: Finished loading, compiling and executing 4 rules
Oct 01 13:46:39 compute-0 polkitd[6977]: Reloading rules
Oct 01 13:46:39 compute-0 polkitd[6977]: Collecting garbage unconditionally...
Oct 01 13:46:39 compute-0 polkitd[6977]: Loading rules from directory /etc/polkit-1/rules.d
Oct 01 13:46:39 compute-0 polkitd[6977]: Loading rules from directory /usr/share/polkit-1/rules.d
Oct 01 13:46:39 compute-0 polkitd[6977]: Finished loading, compiling and executing 4 rules
Oct 01 13:46:41 compute-0 groupadd[127538]: group added to /etc/group: name=ceph, GID=167
Oct 01 13:46:41 compute-0 groupadd[127538]: group added to /etc/gshadow: name=ceph
Oct 01 13:46:41 compute-0 groupadd[127538]: new group: name=ceph, GID=167
Oct 01 13:46:41 compute-0 useradd[127544]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Oct 01 13:46:45 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Oct 01 13:46:45 compute-0 sshd[1007]: Received signal 15; terminating.
Oct 01 13:46:45 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Oct 01 13:46:45 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Oct 01 13:46:45 compute-0 systemd[1]: sshd.service: Consumed 1.832s CPU time, read 0B from disk, written 8.0K to disk.
Oct 01 13:46:45 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Oct 01 13:46:45 compute-0 systemd[1]: Stopping sshd-keygen.target...
Oct 01 13:46:45 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 01 13:46:45 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 01 13:46:45 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct 01 13:46:45 compute-0 systemd[1]: Reached target sshd-keygen.target.
Oct 01 13:46:45 compute-0 systemd[1]: Starting OpenSSH server daemon...
Oct 01 13:46:45 compute-0 sshd[128084]: Server listening on 0.0.0.0 port 22.
Oct 01 13:46:45 compute-0 sshd[128084]: Server listening on :: port 22.
Oct 01 13:46:45 compute-0 systemd[1]: Started OpenSSH server daemon.
Oct 01 13:46:45 compute-0 podman[128061]: 2025-10-01 13:46:45.388109207 +0000 UTC m=+0.107262246 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 01 13:46:45 compute-0 podman[128062]: 2025-10-01 13:46:45.445447099 +0000 UTC m=+0.164008702 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 13:46:48 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 01 13:46:48 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 01 13:46:48 compute-0 systemd[1]: Reloading.
Oct 01 13:46:48 compute-0 systemd-sysv-generator[128364]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:46:48 compute-0 systemd-rc-local-generator[128360]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:46:48 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 01 13:46:50 compute-0 systemd[1]: Starting PackageKit Daemon...
Oct 01 13:46:50 compute-0 PackageKit[130220]: daemon start
Oct 01 13:46:51 compute-0 systemd[1]: Started PackageKit Daemon.
Oct 01 13:46:51 compute-0 sudo[110214]: pam_unix(sudo:session): session closed for user root
Oct 01 13:46:52 compute-0 sudo[131741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exrvquaqazjsljtvghxzylmyhimrvuqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326411.6706147-652-105667448802624/AnsiballZ_systemd.py'
Oct 01 13:46:52 compute-0 sudo[131741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:46:52 compute-0 python3.9[131774]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 01 13:46:52 compute-0 systemd[1]: Reloading.
Oct 01 13:46:52 compute-0 systemd-rc-local-generator[132161]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:46:52 compute-0 systemd-sysv-generator[132166]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:46:53 compute-0 sudo[131741]: pam_unix(sudo:session): session closed for user root
Oct 01 13:46:53 compute-0 sudo[132846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiqbndhziduzgcdrpipyzaznsfmzucxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326413.2920563-652-116915630620774/AnsiballZ_systemd.py'
Oct 01 13:46:53 compute-0 sudo[132846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:46:53 compute-0 python3.9[132865]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 01 13:46:54 compute-0 systemd[1]: Reloading.
Oct 01 13:46:54 compute-0 systemd-rc-local-generator[133248]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:46:54 compute-0 systemd-sysv-generator[133252]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:46:54 compute-0 sudo[132846]: pam_unix(sudo:session): session closed for user root
Oct 01 13:46:54 compute-0 sudo[133977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqwcrauvlowsdkjmxdvtnvzogxlqoesm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326414.489829-652-199356730670776/AnsiballZ_systemd.py'
Oct 01 13:46:54 compute-0 sudo[133977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:46:54 compute-0 unix_chkpwd[134072]: password check failed for user (root)
Oct 01 13:46:54 compute-0 sshd-session[133306]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 01 13:46:55 compute-0 python3.9[133994]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 01 13:46:55 compute-0 systemd[1]: Reloading.
Oct 01 13:46:55 compute-0 systemd-sysv-generator[134392]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:46:55 compute-0 systemd-rc-local-generator[134388]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:46:55 compute-0 sudo[133977]: pam_unix(sudo:session): session closed for user root
Oct 01 13:46:56 compute-0 sudo[135052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzcewvsvkjbgxpcszwebkgeuiclnznli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326415.6768632-652-240967965099676/AnsiballZ_systemd.py'
Oct 01 13:46:56 compute-0 sudo[135052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:46:56 compute-0 python3.9[135070]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 01 13:46:56 compute-0 sshd-session[133306]: Failed password for root from 193.46.255.244 port 25872 ssh2
Oct 01 13:46:57 compute-0 systemd[1]: Reloading.
Oct 01 13:46:57 compute-0 systemd-rc-local-generator[136405]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:46:57 compute-0 systemd-sysv-generator[136408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:46:57 compute-0 sshd-session[135375]: Invalid user Administrator from 80.94.95.116 port 31708
Oct 01 13:46:57 compute-0 sudo[135052]: pam_unix(sudo:session): session closed for user root
Oct 01 13:46:57 compute-0 unix_chkpwd[136571]: password check failed for user (root)
Oct 01 13:46:57 compute-0 sshd-session[135375]: Failed none for invalid user Administrator from 80.94.95.116 port 31708 ssh2
Oct 01 13:46:57 compute-0 sshd-session[135375]: Connection closed by invalid user Administrator 80.94.95.116 port 31708 [preauth]
Oct 01 13:46:58 compute-0 sudo[137067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qatuckiwnjssovakxrmvbthblxkiumoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326417.956599-710-43944813918803/AnsiballZ_systemd.py'
Oct 01 13:46:58 compute-0 sudo[137067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:46:58 compute-0 python3.9[137095]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:46:58 compute-0 systemd[1]: Reloading.
Oct 01 13:46:58 compute-0 systemd-rc-local-generator[137453]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:46:58 compute-0 systemd-sysv-generator[137456]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:46:59 compute-0 sudo[137067]: pam_unix(sudo:session): session closed for user root
Oct 01 13:46:59 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 01 13:46:59 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 01 13:46:59 compute-0 systemd[1]: man-db-cache-update.service: Consumed 14.005s CPU time.
Oct 01 13:46:59 compute-0 systemd[1]: run-re51a269758d24585b7d82e4cff2b1031.service: Deactivated successfully.
Oct 01 13:46:59 compute-0 sudo[137710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdhsxqvpocfwxpyrfgqrpgqysubpfxvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326419.2464368-710-168885726842430/AnsiballZ_systemd.py'
Oct 01 13:46:59 compute-0 sudo[137710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:46:59 compute-0 sshd-session[133306]: Failed password for root from 193.46.255.244 port 25872 ssh2
Oct 01 13:46:59 compute-0 python3.9[137712]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:46:59 compute-0 systemd[1]: Reloading.
Oct 01 13:47:00 compute-0 systemd-rc-local-generator[137741]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:47:00 compute-0 systemd-sysv-generator[137747]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:47:00 compute-0 sudo[137710]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:00 compute-0 unix_chkpwd[137833]: password check failed for user (root)
Oct 01 13:47:00 compute-0 sudo[137902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztwbetpvjjnbbcupmlybqlqqrirpqimc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326420.435089-710-121916120036706/AnsiballZ_systemd.py'
Oct 01 13:47:00 compute-0 sudo[137902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:01 compute-0 python3.9[137904]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:01 compute-0 systemd[1]: Reloading.
Oct 01 13:47:01 compute-0 systemd-sysv-generator[137940]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:47:01 compute-0 systemd-rc-local-generator[137937]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:47:01 compute-0 sudo[137902]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:01 compute-0 sudo[138092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snaddtrqgeuwrpksbvchjeqpqyyuavwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326421.6186726-710-35969686400254/AnsiballZ_systemd.py'
Oct 01 13:47:01 compute-0 sudo[138092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:02 compute-0 python3.9[138094]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:03 compute-0 sshd-session[133306]: Failed password for root from 193.46.255.244 port 25872 ssh2
Oct 01 13:47:03 compute-0 sudo[138092]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:03 compute-0 sudo[138247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtfvcebpuotvgymijrwpnilxftrgctdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326423.60873-710-50476000333053/AnsiballZ_systemd.py'
Oct 01 13:47:03 compute-0 sudo[138247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:03 compute-0 sshd-session[133306]: Received disconnect from 193.46.255.244 port 25872:11:  [preauth]
Oct 01 13:47:03 compute-0 sshd-session[133306]: Disconnected from authenticating user root 193.46.255.244 port 25872 [preauth]
Oct 01 13:47:03 compute-0 sshd-session[133306]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 01 13:47:04 compute-0 python3.9[138249]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:04 compute-0 systemd[1]: Reloading.
Oct 01 13:47:04 compute-0 systemd-rc-local-generator[138277]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:47:04 compute-0 systemd-sysv-generator[138284]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:47:04 compute-0 sudo[138247]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:05 compute-0 unix_chkpwd[138413]: password check failed for user (root)
Oct 01 13:47:05 compute-0 sshd-session[138255]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 01 13:47:05 compute-0 sudo[138440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikubctdworxmwvwjntnjjczexltzigrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326424.967121-782-11275496165629/AnsiballZ_systemd.py'
Oct 01 13:47:05 compute-0 sudo[138440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:05 compute-0 python3.9[138442]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct 01 13:47:05 compute-0 systemd[1]: Reloading.
Oct 01 13:47:05 compute-0 systemd-rc-local-generator[138469]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:47:05 compute-0 systemd-sysv-generator[138473]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:47:06 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Oct 01 13:47:06 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct 01 13:47:06 compute-0 sudo[138440]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:06 compute-0 sudo[138633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igriqdvhjdwrlyfmnkctrzkogxhlxjum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326426.3215632-798-234311712893262/AnsiballZ_systemd.py'
Oct 01 13:47:06 compute-0 sudo[138633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:06 compute-0 python3.9[138635]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:07 compute-0 sudo[138633]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:07 compute-0 sshd-session[138255]: Failed password for root from 193.46.255.244 port 21762 ssh2
Oct 01 13:47:07 compute-0 sudo[138788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxlepjxtbybtshsetqmixazxhtpgeych ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326427.2686734-798-222118249281908/AnsiballZ_systemd.py'
Oct 01 13:47:07 compute-0 sudo[138788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:08 compute-0 unix_chkpwd[138791]: password check failed for user (root)
Oct 01 13:47:08 compute-0 python3.9[138790]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:08 compute-0 sudo[138788]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:08 compute-0 sudo[138944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhulsnuilgcqkscpzkzdpazspzxhwdgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326428.354986-798-185106853617323/AnsiballZ_systemd.py'
Oct 01 13:47:08 compute-0 sudo[138944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:09 compute-0 python3.9[138946]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:09 compute-0 sudo[138944]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:09 compute-0 sudo[139099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbjkaagvlktmzgfbqztbtainouqhkbpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326429.3382778-798-253235028751322/AnsiballZ_systemd.py'
Oct 01 13:47:09 compute-0 sudo[139099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:10 compute-0 python3.9[139101]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:10 compute-0 sudo[139099]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:10 compute-0 sshd-session[138255]: Failed password for root from 193.46.255.244 port 21762 ssh2
Oct 01 13:47:10 compute-0 sudo[139254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umkptkytdvkfcqyiiihoqbfvwluescbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326430.3258386-798-163049803252956/AnsiballZ_systemd.py'
Oct 01 13:47:10 compute-0 sudo[139254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:10 compute-0 unix_chkpwd[139257]: password check failed for user (root)
Oct 01 13:47:11 compute-0 python3.9[139256]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:11 compute-0 sudo[139254]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:11 compute-0 sudo[139411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brcdrvtfnjceglbxpzojkrkqvpsvfxwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326431.3589685-798-204494173911243/AnsiballZ_systemd.py'
Oct 01 13:47:11 compute-0 sudo[139411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:12 compute-0 python3.9[139413]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:12 compute-0 sudo[139411]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:12 compute-0 sshd-session[138255]: Failed password for root from 193.46.255.244 port 21762 ssh2
Oct 01 13:47:12 compute-0 sudo[139566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufawiuvtfhglfqfeuivwswgzjvbesedu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326432.3522823-798-181246707561370/AnsiballZ_systemd.py'
Oct 01 13:47:12 compute-0 sudo[139566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:13 compute-0 python3.9[139568]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:13 compute-0 sudo[139566]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:13 compute-0 sshd-session[138255]: Received disconnect from 193.46.255.244 port 21762:11:  [preauth]
Oct 01 13:47:13 compute-0 sshd-session[138255]: Disconnected from authenticating user root 193.46.255.244 port 21762 [preauth]
Oct 01 13:47:13 compute-0 sshd-session[138255]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 01 13:47:13 compute-0 sudo[139723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oefshbazhmeutdxilsjqbutjlxwmcfgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326433.5296779-798-189936890639332/AnsiballZ_systemd.py'
Oct 01 13:47:13 compute-0 sudo[139723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:47:14.177 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:47:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:47:14.179 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:47:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:47:14.179 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:47:14 compute-0 python3.9[139725]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:14 compute-0 unix_chkpwd[139729]: password check failed for user (root)
Oct 01 13:47:14 compute-0 sshd-session[139671]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 01 13:47:15 compute-0 sudo[139723]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:16 compute-0 sudo[139901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snlgaqkmoalrncmmsmzcbtaalrzapevi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326435.559033-798-191767816483830/AnsiballZ_systemd.py'
Oct 01 13:47:16 compute-0 sudo[139901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:16 compute-0 podman[139854]: 2025-10-01 13:47:16.032383181 +0000 UTC m=+0.100531245 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 01 13:47:16 compute-0 podman[139855]: 2025-10-01 13:47:16.092725925 +0000 UTC m=+0.159553313 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 01 13:47:16 compute-0 sshd-session[139671]: Failed password for root from 193.46.255.244 port 45852 ssh2
Oct 01 13:47:16 compute-0 python3.9[139912]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:16 compute-0 sudo[139901]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:16 compute-0 sudo[140077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuqilhniuihkbzexhiwpoaecezexkdzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326436.6333134-798-201258624849715/AnsiballZ_systemd.py'
Oct 01 13:47:16 compute-0 sudo[140077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:17 compute-0 unix_chkpwd[140080]: password check failed for user (root)
Oct 01 13:47:17 compute-0 python3.9[140079]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:17 compute-0 sudo[140077]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:17 compute-0 sudo[140233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhmuhkhftkuaqpstgjtqegiqawudvztb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326437.5789719-798-157767741963016/AnsiballZ_systemd.py'
Oct 01 13:47:17 compute-0 sudo[140233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:18 compute-0 python3.9[140235]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:18 compute-0 sudo[140233]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:19 compute-0 sudo[140388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zedpyhifixzjqoulqduvxogtriavyytj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326438.598016-798-131514305488985/AnsiballZ_systemd.py'
Oct 01 13:47:19 compute-0 sudo[140388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:19 compute-0 sshd-session[139671]: Failed password for root from 193.46.255.244 port 45852 ssh2
Oct 01 13:47:19 compute-0 python3.9[140390]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:19 compute-0 sudo[140388]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:20 compute-0 sudo[140543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juoqgyxqcjxxgrprweifwqwzxsowtgqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326439.7250488-798-124854307133744/AnsiballZ_systemd.py'
Oct 01 13:47:20 compute-0 sudo[140543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:20 compute-0 unix_chkpwd[140545]: password check failed for user (root)
Oct 01 13:47:20 compute-0 python3.9[140546]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:20 compute-0 sudo[140543]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:21 compute-0 sudo[140699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqdpslzhurzrgekveyigjqpjfxpbzjdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326440.7185442-798-147996499433164/AnsiballZ_systemd.py'
Oct 01 13:47:21 compute-0 sudo[140699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:21 compute-0 python3.9[140701]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct 01 13:47:21 compute-0 sudo[140699]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:22 compute-0 sshd-session[139671]: Failed password for root from 193.46.255.244 port 45852 ssh2
Oct 01 13:47:22 compute-0 sudo[140854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csuljvtfpufkwrryvyygudlarkelmxdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326442.0202472-1002-106362752702111/AnsiballZ_file.py'
Oct 01 13:47:22 compute-0 sudo[140854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:22 compute-0 python3.9[140856]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:47:22 compute-0 sudo[140854]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:22 compute-0 sshd-session[139671]: Received disconnect from 193.46.255.244 port 45852:11:  [preauth]
Oct 01 13:47:22 compute-0 sshd-session[139671]: Disconnected from authenticating user root 193.46.255.244 port 45852 [preauth]
Oct 01 13:47:22 compute-0 sshd-session[139671]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 01 13:47:23 compute-0 sudo[141006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwzjrpltzwhklluyvxkfawpvxypjdbod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326442.8348887-1002-267730887596501/AnsiballZ_file.py'
Oct 01 13:47:23 compute-0 sudo[141006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:23 compute-0 python3.9[141008]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:47:23 compute-0 sudo[141006]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:23 compute-0 sudo[141158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xctluferblxxqzmzdoppxpjkqytqhjkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326443.6018004-1002-140154432957959/AnsiballZ_file.py'
Oct 01 13:47:23 compute-0 sudo[141158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:24 compute-0 python3.9[141160]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:47:24 compute-0 sudo[141158]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:24 compute-0 sudo[141310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inhcemswgvnatmdbmksmttgpqabuoorm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326444.4151103-1002-117778351375081/AnsiballZ_file.py'
Oct 01 13:47:24 compute-0 sudo[141310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:25 compute-0 python3.9[141312]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:47:25 compute-0 sudo[141310]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:25 compute-0 sudo[141462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybfiipndmcprwvibfurxgpekjtaixfwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326445.2337866-1002-105562003752485/AnsiballZ_file.py'
Oct 01 13:47:25 compute-0 sudo[141462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:25 compute-0 python3.9[141464]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:47:25 compute-0 sudo[141462]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:26 compute-0 sudo[141614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izsjhsjsidtvcukwkzlkcntewklxgylm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326446.0233698-1002-225386025569859/AnsiballZ_file.py'
Oct 01 13:47:26 compute-0 sudo[141614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:26 compute-0 python3.9[141616]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:47:26 compute-0 sudo[141614]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:27 compute-0 sudo[141766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krlkblcyogxnclljrbdyarftyveiqhcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326446.767516-1088-131120968016869/AnsiballZ_stat.py'
Oct 01 13:47:27 compute-0 sudo[141766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:27 compute-0 python3.9[141768]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:27 compute-0 sudo[141766]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:28 compute-0 sudo[141891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knywixfcddhznmzeabiycklrzixejvvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326446.767516-1088-131120968016869/AnsiballZ_copy.py'
Oct 01 13:47:28 compute-0 sudo[141891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:28 compute-0 python3.9[141893]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759326446.767516-1088-131120968016869/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:28 compute-0 sudo[141891]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:28 compute-0 sudo[142043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbxddnhzdwvdttbfnaxmchfsawddellv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326448.416631-1088-48462131887457/AnsiballZ_stat.py'
Oct 01 13:47:28 compute-0 sudo[142043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:29 compute-0 python3.9[142045]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:29 compute-0 sudo[142043]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:29 compute-0 sudo[142168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccybdreokcmwjanmojkqnydhidixecjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326448.416631-1088-48462131887457/AnsiballZ_copy.py'
Oct 01 13:47:29 compute-0 sudo[142168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:29 compute-0 python3.9[142170]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759326448.416631-1088-48462131887457/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:29 compute-0 sudo[142168]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:30 compute-0 sudo[142320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqflwobzcqprmdeevmahnagkgmylsxoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326449.9087934-1088-237811663319063/AnsiballZ_stat.py'
Oct 01 13:47:30 compute-0 sudo[142320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:30 compute-0 python3.9[142322]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:30 compute-0 sudo[142320]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:30 compute-0 sudo[142445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txbwnmjdsworfbtdmzqfagzjwiofslud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326449.9087934-1088-237811663319063/AnsiballZ_copy.py'
Oct 01 13:47:30 compute-0 sudo[142445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:31 compute-0 python3.9[142447]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759326449.9087934-1088-237811663319063/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:31 compute-0 sudo[142445]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:31 compute-0 sudo[142597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnhbharhbfzoyzjkgqamulyohdwpotzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326451.382633-1088-162180247776451/AnsiballZ_stat.py'
Oct 01 13:47:31 compute-0 sudo[142597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:31 compute-0 python3.9[142599]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:31 compute-0 sudo[142597]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:32 compute-0 sudo[142722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcgayksrhlkqqrccprllkaisljdfzdov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326451.382633-1088-162180247776451/AnsiballZ_copy.py'
Oct 01 13:47:32 compute-0 sudo[142722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:32 compute-0 python3.9[142724]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759326451.382633-1088-162180247776451/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:32 compute-0 sudo[142722]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:33 compute-0 sudo[142874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwjoknfcljdyjjyvgbnjshpyktopfxuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326452.8054283-1088-246087170447132/AnsiballZ_stat.py'
Oct 01 13:47:33 compute-0 sudo[142874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:33 compute-0 python3.9[142876]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:33 compute-0 sudo[142874]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:33 compute-0 sudo[142999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytkmsxnxkksccttrpvfkjumuksequpub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326452.8054283-1088-246087170447132/AnsiballZ_copy.py'
Oct 01 13:47:33 compute-0 sudo[142999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:34 compute-0 python3.9[143001]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759326452.8054283-1088-246087170447132/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:34 compute-0 sudo[142999]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:34 compute-0 sudo[143151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifbifxvkkfqzlmmbrmckestraztnxjuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326454.2230592-1088-170645545777021/AnsiballZ_stat.py'
Oct 01 13:47:34 compute-0 sudo[143151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:34 compute-0 python3.9[143153]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:34 compute-0 sudo[143151]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:35 compute-0 sudo[143276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckvilgttdpraelkozwrnqpykrouiigse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326454.2230592-1088-170645545777021/AnsiballZ_copy.py'
Oct 01 13:47:35 compute-0 sudo[143276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:35 compute-0 python3.9[143278]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759326454.2230592-1088-170645545777021/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:35 compute-0 sudo[143276]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:36 compute-0 sudo[143428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnjkqnqagibybtdgisgmhujleksjnses ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326455.695848-1088-15622843269414/AnsiballZ_stat.py'
Oct 01 13:47:36 compute-0 sudo[143428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:36 compute-0 python3.9[143430]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:36 compute-0 sudo[143428]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:36 compute-0 sudo[143551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfzxqbsxykuqkyevbokuxzjkqylajcil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326455.695848-1088-15622843269414/AnsiballZ_copy.py'
Oct 01 13:47:36 compute-0 sudo[143551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:36 compute-0 python3.9[143553]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759326455.695848-1088-15622843269414/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:36 compute-0 sudo[143551]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:37 compute-0 sudo[143703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shgwofvlyubqrikuzaicikxspfjipfgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326457.0892854-1088-215688918694978/AnsiballZ_stat.py'
Oct 01 13:47:37 compute-0 sudo[143703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:37 compute-0 python3.9[143705]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:37 compute-0 sudo[143703]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:38 compute-0 sudo[143828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwjnfhzoroorghtmgqmbjfwwiysxgrwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326457.0892854-1088-215688918694978/AnsiballZ_copy.py'
Oct 01 13:47:38 compute-0 sudo[143828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:38 compute-0 python3.9[143830]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759326457.0892854-1088-215688918694978/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:38 compute-0 sudo[143828]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:38 compute-0 sudo[143980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjcplfgxdetbjagvpyeyfmgkfgdxlxyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326458.6116557-1314-133942343266322/AnsiballZ_command.py'
Oct 01 13:47:38 compute-0 sudo[143980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:39 compute-0 python3.9[143982]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct 01 13:47:39 compute-0 sudo[143980]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:39 compute-0 sudo[144133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzlzgfutqfiajgiebhyuudgcznibalhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326459.509697-1332-153593116393855/AnsiballZ_file.py'
Oct 01 13:47:39 compute-0 sudo[144133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:40 compute-0 python3.9[144135]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:40 compute-0 sudo[144133]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:40 compute-0 sudo[144285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpkheqavktkgmixprirhjaoluzzkpynx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326460.2947369-1332-96248802190107/AnsiballZ_file.py'
Oct 01 13:47:40 compute-0 sudo[144285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:40 compute-0 python3.9[144287]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:40 compute-0 sudo[144285]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:41 compute-0 sudo[144437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmorhqmeetjfehljeyslgiavgupggmtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326461.0485687-1332-142314134411998/AnsiballZ_file.py'
Oct 01 13:47:41 compute-0 sudo[144437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:41 compute-0 python3.9[144439]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:41 compute-0 sudo[144437]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:42 compute-0 sudo[144589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajaacbbsabnssojgngsypyzlalbtnddm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326461.7765727-1332-112696278030359/AnsiballZ_file.py'
Oct 01 13:47:42 compute-0 sudo[144589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:42 compute-0 python3.9[144591]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:42 compute-0 sudo[144589]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:42 compute-0 sudo[144741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mstfxgvazgpdbrelbtrsgkeudpvbgxeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326462.612823-1332-45026869496308/AnsiballZ_file.py'
Oct 01 13:47:42 compute-0 sudo[144741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:43 compute-0 python3.9[144743]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:43 compute-0 sudo[144741]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:43 compute-0 sudo[144893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoolxpwlpfmhezeixufcgxhndabskqcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326463.3983831-1332-173361331584566/AnsiballZ_file.py'
Oct 01 13:47:43 compute-0 sudo[144893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:44 compute-0 python3.9[144895]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:44 compute-0 sudo[144893]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:44 compute-0 sudo[145045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmieupimxgwpixnojmapeagzwzbqqkjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326464.2174175-1332-128556433390472/AnsiballZ_file.py'
Oct 01 13:47:44 compute-0 sudo[145045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:44 compute-0 python3.9[145047]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:44 compute-0 sudo[145045]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:45 compute-0 sudo[145197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdnmaidckjouqezufidsxiiikttpbbxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326465.0168624-1332-268648175890747/AnsiballZ_file.py'
Oct 01 13:47:45 compute-0 sudo[145197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:45 compute-0 python3.9[145199]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:45 compute-0 sudo[145197]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:46 compute-0 podman[145312]: 2025-10-01 13:47:46.182471723 +0000 UTC m=+0.089287657 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 13:47:46 compute-0 sudo[145378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktwulayzbkgqxbxwoxixusyjykkocheu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326465.7978392-1332-45305734137835/AnsiballZ_file.py'
Oct 01 13:47:46 compute-0 sudo[145378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:46 compute-0 podman[145333]: 2025-10-01 13:47:46.303710455 +0000 UTC m=+0.137055556 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Oct 01 13:47:46 compute-0 python3.9[145389]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:46 compute-0 sudo[145378]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:47 compute-0 sudo[145547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njwtwldscaqulunbpbysbpjrnlzbkquj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326466.6876495-1332-190372215447940/AnsiballZ_file.py'
Oct 01 13:47:47 compute-0 sudo[145547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:47 compute-0 python3.9[145549]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:47 compute-0 sudo[145547]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:47 compute-0 sudo[145699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymsvfyubtlgohmmommsjheeclygyvgtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326467.4243329-1332-208117475957658/AnsiballZ_file.py'
Oct 01 13:47:47 compute-0 sudo[145699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:47 compute-0 python3.9[145701]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:48 compute-0 sudo[145699]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:48 compute-0 sudo[145851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsgbtdceagjmahfgtnnodxpkcuwfxqyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326468.1625516-1332-271689502447243/AnsiballZ_file.py'
Oct 01 13:47:48 compute-0 sudo[145851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:48 compute-0 python3.9[145853]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:48 compute-0 sudo[145851]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:49 compute-0 sudo[146003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgewnobbmiazhdiukzusnjexmhqzcile ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326468.9037077-1332-135787204509363/AnsiballZ_file.py'
Oct 01 13:47:49 compute-0 sudo[146003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:49 compute-0 python3.9[146005]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:49 compute-0 sudo[146003]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:50 compute-0 sudo[146155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbjncmegmwnoorvhqglscxfwzlrpxhxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326469.6352382-1332-142334771753900/AnsiballZ_file.py'
Oct 01 13:47:50 compute-0 sudo[146155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:50 compute-0 python3.9[146157]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:50 compute-0 sudo[146155]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:50 compute-0 sudo[146307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nirmesnyqzadwrwfuxgefmcwuivjpapk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326470.469184-1530-68289918988128/AnsiballZ_stat.py'
Oct 01 13:47:50 compute-0 sudo[146307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:51 compute-0 python3.9[146309]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:51 compute-0 sudo[146307]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:51 compute-0 sudo[146430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdarlldxijmkzyatmbtoupdqtwgkhdwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326470.469184-1530-68289918988128/AnsiballZ_copy.py'
Oct 01 13:47:51 compute-0 sudo[146430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:51 compute-0 python3.9[146432]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326470.469184-1530-68289918988128/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:51 compute-0 sudo[146430]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:52 compute-0 sudo[146582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzjfsuvyszgegdzgervjgjewepbdifke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326471.9790351-1530-122078062904093/AnsiballZ_stat.py'
Oct 01 13:47:52 compute-0 sudo[146582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:52 compute-0 python3.9[146584]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:52 compute-0 sudo[146582]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:53 compute-0 sudo[146705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enxkmmrurpbgidhbgfuyuudjigvgdrxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326471.9790351-1530-122078062904093/AnsiballZ_copy.py'
Oct 01 13:47:53 compute-0 sudo[146705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:53 compute-0 python3.9[146707]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326471.9790351-1530-122078062904093/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:53 compute-0 sudo[146705]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:53 compute-0 sudo[146857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhnjwuawerwbmhycbldxofnifriteofa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326473.4957142-1530-66407615816652/AnsiballZ_stat.py'
Oct 01 13:47:53 compute-0 sudo[146857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:54 compute-0 python3.9[146859]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:54 compute-0 sudo[146857]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:54 compute-0 sudo[146980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cefhhhboosqblbcrimljgzepfnnmvuig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326473.4957142-1530-66407615816652/AnsiballZ_copy.py'
Oct 01 13:47:54 compute-0 sudo[146980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:54 compute-0 python3.9[146982]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326473.4957142-1530-66407615816652/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:54 compute-0 sudo[146980]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:55 compute-0 sudo[147132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebgswuzqiqccpboerxiglzmimdjuislm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326475.036826-1530-130006370387504/AnsiballZ_stat.py'
Oct 01 13:47:55 compute-0 sudo[147132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:55 compute-0 python3.9[147134]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:55 compute-0 sudo[147132]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:56 compute-0 sudo[147255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwunytngpooapfymytzclsjfuwgduyjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326475.036826-1530-130006370387504/AnsiballZ_copy.py'
Oct 01 13:47:56 compute-0 sudo[147255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:56 compute-0 python3.9[147257]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326475.036826-1530-130006370387504/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:56 compute-0 sudo[147255]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:56 compute-0 sudo[147407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjuqfpozjopcpsxzfbtdszzshgkdthzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326476.5193512-1530-153364627833612/AnsiballZ_stat.py'
Oct 01 13:47:56 compute-0 sudo[147407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:57 compute-0 python3.9[147409]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:57 compute-0 sudo[147407]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:57 compute-0 sudo[147530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tapipmbvtbmxsdxytplfcuezlkqtddey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326476.5193512-1530-153364627833612/AnsiballZ_copy.py'
Oct 01 13:47:57 compute-0 sudo[147530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:57 compute-0 python3.9[147532]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326476.5193512-1530-153364627833612/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:57 compute-0 sudo[147530]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:58 compute-0 sudo[147682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcmholqmtkaryaqlbxfdirtmqlxvdtyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326477.8662922-1530-125812679156502/AnsiballZ_stat.py'
Oct 01 13:47:58 compute-0 sudo[147682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:58 compute-0 python3.9[147684]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:58 compute-0 sudo[147682]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:58 compute-0 sudo[147805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quhxzotdwgvbrgfoyqtrsoneyjwxyslh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326477.8662922-1530-125812679156502/AnsiballZ_copy.py'
Oct 01 13:47:58 compute-0 sudo[147805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:58 compute-0 python3.9[147807]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326477.8662922-1530-125812679156502/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:47:58 compute-0 sudo[147805]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:59 compute-0 sudo[147957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykbyktqjzttxnatuvotiazkbzhjnvhem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326479.0307963-1530-62417569943940/AnsiballZ_stat.py'
Oct 01 13:47:59 compute-0 sudo[147957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:47:59 compute-0 python3.9[147959]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:47:59 compute-0 sudo[147957]: pam_unix(sudo:session): session closed for user root
Oct 01 13:47:59 compute-0 sudo[148080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kefuatzseomnjdjidlaujwburgkchxur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326479.0307963-1530-62417569943940/AnsiballZ_copy.py'
Oct 01 13:47:59 compute-0 sudo[148080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:00 compute-0 python3.9[148082]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326479.0307963-1530-62417569943940/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:00 compute-0 sudo[148080]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:00 compute-0 sudo[148232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmrmoqqkwtvjhwtfemtuoxnvugmowhcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326480.3176348-1530-83523398841905/AnsiballZ_stat.py'
Oct 01 13:48:00 compute-0 sudo[148232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:00 compute-0 python3.9[148234]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:00 compute-0 sudo[148232]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:01 compute-0 sudo[148355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxfjeoyovmkrpbrypvbhdagfwsdlyhiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326480.3176348-1530-83523398841905/AnsiballZ_copy.py'
Oct 01 13:48:01 compute-0 sudo[148355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:01 compute-0 python3.9[148357]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326480.3176348-1530-83523398841905/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:01 compute-0 sudo[148355]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:02 compute-0 sudo[148507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzzcbluogrzpsifvdhhcvubmajyewhof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326481.6680171-1530-108354302967010/AnsiballZ_stat.py'
Oct 01 13:48:02 compute-0 sudo[148507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:02 compute-0 python3.9[148509]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:02 compute-0 sudo[148507]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:02 compute-0 sudo[148630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hijvqmqajhrqlrybskiaktquaifgciys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326481.6680171-1530-108354302967010/AnsiballZ_copy.py'
Oct 01 13:48:02 compute-0 sudo[148630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:02 compute-0 python3.9[148632]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326481.6680171-1530-108354302967010/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:02 compute-0 sudo[148630]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:03 compute-0 sudo[148782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjzypfnhattpwvypbqkgsductbsxhlbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326483.0889692-1530-216796164182357/AnsiballZ_stat.py'
Oct 01 13:48:03 compute-0 sudo[148782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:03 compute-0 python3.9[148784]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:03 compute-0 sudo[148782]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:04 compute-0 sudo[148905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwmhtqfsisungdnzpzcxrhncgvnasojt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326483.0889692-1530-216796164182357/AnsiballZ_copy.py'
Oct 01 13:48:04 compute-0 sudo[148905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:04 compute-0 python3.9[148907]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326483.0889692-1530-216796164182357/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:04 compute-0 sudo[148905]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:04 compute-0 sudo[149057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iczwrsznzmtwhpjthmlkpojlwqyddeiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326484.5278504-1530-128366146918122/AnsiballZ_stat.py'
Oct 01 13:48:04 compute-0 sudo[149057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:05 compute-0 python3.9[149059]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:05 compute-0 sudo[149057]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:05 compute-0 sudo[149180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpknztwwvxfycygncpqdxyffglmnggyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326484.5278504-1530-128366146918122/AnsiballZ_copy.py'
Oct 01 13:48:05 compute-0 sudo[149180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:05 compute-0 python3.9[149182]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326484.5278504-1530-128366146918122/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:05 compute-0 sudo[149180]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:06 compute-0 sudo[149332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnvfrpilxjagvaymfegavuywxkeziqra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326485.9958096-1530-141195860360860/AnsiballZ_stat.py'
Oct 01 13:48:06 compute-0 sudo[149332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:06 compute-0 python3.9[149334]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:06 compute-0 sudo[149332]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:06 compute-0 sudo[149455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bycqqfvjclhkjriunekcthmvcjdjpfpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326485.9958096-1530-141195860360860/AnsiballZ_copy.py'
Oct 01 13:48:06 compute-0 sudo[149455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:07 compute-0 python3.9[149457]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326485.9958096-1530-141195860360860/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:07 compute-0 sudo[149455]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:07 compute-0 sudo[149607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpnjmazimabgaudrqgbpnooqmmmtyesu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326487.3587036-1530-19909338074606/AnsiballZ_stat.py'
Oct 01 13:48:07 compute-0 sudo[149607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:07 compute-0 python3.9[149609]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:07 compute-0 sudo[149607]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:08 compute-0 sudo[149730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuiebkltnvjnjbbvaefeppfoqbpeykri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326487.3587036-1530-19909338074606/AnsiballZ_copy.py'
Oct 01 13:48:08 compute-0 sudo[149730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:08 compute-0 python3.9[149732]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326487.3587036-1530-19909338074606/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:08 compute-0 sudo[149730]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:09 compute-0 sudo[149882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqozczhdiokcbeaumvrfpejthlgsynxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326488.8000474-1530-154801952200089/AnsiballZ_stat.py'
Oct 01 13:48:09 compute-0 sudo[149882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:09 compute-0 python3.9[149884]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:09 compute-0 sudo[149882]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:09 compute-0 sudo[150005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grzrsbvrjkfaofwqvnucvihzkcxtssno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326488.8000474-1530-154801952200089/AnsiballZ_copy.py'
Oct 01 13:48:09 compute-0 sudo[150005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:10 compute-0 python3.9[150007]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326488.8000474-1530-154801952200089/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:10 compute-0 sudo[150005]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:10 compute-0 python3.9[150157]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:48:11 compute-0 sudo[150311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdcaedcjcflzgeigsfhulyvfjxigeuvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326491.2085786-1942-157178399342097/AnsiballZ_seboolean.py'
Oct 01 13:48:11 compute-0 sudo[150311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:12 compute-0 python3.9[150313]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct 01 13:48:12 compute-0 sshd-session[150161]: error: kex_exchange_identification: read: Connection reset by peer
Oct 01 13:48:12 compute-0 sshd-session[150161]: Connection reset by 152.53.80.20 port 39796
Oct 01 13:48:13 compute-0 sudo[150311]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:13 compute-0 sudo[150467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tegqyuekpvifpcnfstlmxkdcbxrzzkha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326493.502593-1958-39878343134141/AnsiballZ_copy.py'
Oct 01 13:48:13 compute-0 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Oct 01 13:48:13 compute-0 sudo[150467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:14 compute-0 python3.9[150469]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:14 compute-0 sudo[150467]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:48:14.183 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:48:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:48:14.188 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:48:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:48:14.188 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:48:14 compute-0 sudo[150620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crruclcjjfqamabfwnfbtzuteaompsvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326494.3206382-1958-209888306212984/AnsiballZ_copy.py'
Oct 01 13:48:14 compute-0 sudo[150620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:14 compute-0 python3.9[150622]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:14 compute-0 sudo[150620]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:15 compute-0 sudo[150772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zobsnpsciqjqchfbbuneuskqkgawxgpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326495.0990067-1958-216809045589272/AnsiballZ_copy.py'
Oct 01 13:48:15 compute-0 sudo[150772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:15 compute-0 python3.9[150774]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:15 compute-0 sudo[150772]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:16 compute-0 podman[150898]: 2025-10-01 13:48:16.392911171 +0000 UTC m=+0.100291367 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 01 13:48:16 compute-0 sudo[150937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzlieocglmmsuacdqbmtjjgzweaohnro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326495.9319363-1958-127280195653617/AnsiballZ_copy.py'
Oct 01 13:48:16 compute-0 sudo[150937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:16 compute-0 podman[150945]: 2025-10-01 13:48:16.601643905 +0000 UTC m=+0.186610121 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 01 13:48:16 compute-0 python3.9[150946]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:16 compute-0 sudo[150937]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:17 compute-0 sudo[151122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxvylowfajbguqlhdnigpjzzcxgecevv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326496.815356-1958-241638380402269/AnsiballZ_copy.py'
Oct 01 13:48:17 compute-0 sudo[151122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:17 compute-0 python3.9[151124]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:17 compute-0 sudo[151122]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:18 compute-0 sudo[151274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mubikizlnqfgkpdqawbsluojckwgggvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326497.720666-2030-167835629354892/AnsiballZ_copy.py'
Oct 01 13:48:18 compute-0 sudo[151274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:18 compute-0 python3.9[151276]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:18 compute-0 sudo[151274]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:18 compute-0 sudo[151426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioxdhgpnsgtworxcwamrinwepudyosch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326498.5890393-2030-6247023036743/AnsiballZ_copy.py'
Oct 01 13:48:18 compute-0 sudo[151426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:19 compute-0 python3.9[151428]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:19 compute-0 sudo[151426]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:19 compute-0 sudo[151578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzlrtcyxztllxwbavaeofitsmutpdqbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326499.3553002-2030-91044305628418/AnsiballZ_copy.py'
Oct 01 13:48:19 compute-0 sudo[151578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:19 compute-0 python3.9[151580]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:19 compute-0 sudo[151578]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:20 compute-0 sudo[151730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbnkcrkclcfvozcktnysfxauxxabzgbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326500.0161772-2030-60583343767222/AnsiballZ_copy.py'
Oct 01 13:48:20 compute-0 sudo[151730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:20 compute-0 python3.9[151732]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:20 compute-0 sudo[151730]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:21 compute-0 sudo[151882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvpadgenfornryohkjqlenzqxuupkygm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326500.7790966-2030-111002312970924/AnsiballZ_copy.py'
Oct 01 13:48:21 compute-0 sudo[151882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:21 compute-0 python3.9[151884]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:21 compute-0 sudo[151882]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:21 compute-0 sudo[152034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pevyzxldlqpyonmubttokqtgdidhbjnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326501.6329527-2102-146827981473612/AnsiballZ_systemd.py'
Oct 01 13:48:21 compute-0 sudo[152034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:22 compute-0 python3.9[152036]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:48:22 compute-0 systemd[1]: Reloading.
Oct 01 13:48:22 compute-0 systemd-rc-local-generator[152059]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:48:22 compute-0 systemd-sysv-generator[152064]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:48:22 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Oct 01 13:48:22 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Oct 01 13:48:22 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Oct 01 13:48:22 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct 01 13:48:22 compute-0 systemd[1]: Starting libvirt logging daemon...
Oct 01 13:48:22 compute-0 systemd[1]: Started libvirt logging daemon.
Oct 01 13:48:22 compute-0 sudo[152034]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:23 compute-0 sudo[152226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdboxsncbjlzgqwjplucoymcembgdjrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326502.904384-2102-15627088748713/AnsiballZ_systemd.py'
Oct 01 13:48:23 compute-0 sudo[152226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:23 compute-0 python3.9[152228]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:48:23 compute-0 systemd[1]: Reloading.
Oct 01 13:48:23 compute-0 systemd-sysv-generator[152257]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:48:23 compute-0 systemd-rc-local-generator[152251]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:48:23 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Oct 01 13:48:23 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct 01 13:48:23 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct 01 13:48:23 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct 01 13:48:23 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct 01 13:48:23 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct 01 13:48:23 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Oct 01 13:48:23 compute-0 systemd[1]: Started libvirt nodedev daemon.
Oct 01 13:48:24 compute-0 sudo[152226]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:24 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct 01 13:48:24 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct 01 13:48:24 compute-0 sudo[152446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzvalghwdrchdbsuouquosscnvkzztam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326504.1926434-2102-267106022934163/AnsiballZ_systemd.py'
Oct 01 13:48:24 compute-0 sudo[152446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:24 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct 01 13:48:24 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct 01 13:48:24 compute-0 python3.9[152451]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:48:24 compute-0 systemd[1]: Reloading.
Oct 01 13:48:24 compute-0 systemd-rc-local-generator[152478]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:48:24 compute-0 systemd-sysv-generator[152484]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:48:25 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct 01 13:48:25 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct 01 13:48:25 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct 01 13:48:25 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct 01 13:48:25 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 01 13:48:25 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 01 13:48:25 compute-0 sudo[152446]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:25 compute-0 setroubleshoot[152291]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 77bc0dd7-d330-4c02-9158-bffd173d07dd
Oct 01 13:48:25 compute-0 setroubleshoot[152291]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
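[Editor's note: the sealert suggestions quoted above can be condensed into the following command sequence. This is a sketch only, assuming root privileges and the audit/policycoreutils toolchain; the module name my-virtlogd comes from the log text, and a local policy module should only be installed after confirming the denial is expected behaviour rather than a mislabeled file.]

```shell
# Step 1 (dac_override plugin): enable full auditing so the next AVC
# carries a PATH record, then re-trigger the denial and inspect it.
auditctl -w /etc/shadow -p w
ausearch -m avc -ts recent

# Step 2 (catchall plugin): if the access is legitimate, build and
# install a local policy module from the raw virtlogd AVC records.
ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
semodule -X 300 -i my-virtlogd.pp

# To back the local module out later:
#   semodule -X 300 -r my-virtlogd
```

The `-X 300` priority keeps the local module separate from distribution policy, so it can be removed cleanly if a fixed selinux-policy package later ships the permission.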
Oct 01 13:48:25 compute-0 sudo[152662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlizlpelhyewleoocxlkyzmotaybsxso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326505.459119-2102-214054493797186/AnsiballZ_systemd.py'
Oct 01 13:48:25 compute-0 sudo[152662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:26 compute-0 python3.9[152664]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:48:26 compute-0 systemd[1]: Reloading.
Oct 01 13:48:26 compute-0 systemd-rc-local-generator[152691]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:48:26 compute-0 systemd-sysv-generator[152694]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:48:26 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Oct 01 13:48:26 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Oct 01 13:48:26 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct 01 13:48:26 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct 01 13:48:26 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct 01 13:48:26 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct 01 13:48:26 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct 01 13:48:26 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct 01 13:48:26 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct 01 13:48:26 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct 01 13:48:26 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Oct 01 13:48:26 compute-0 systemd[1]: Started libvirt QEMU daemon.
Oct 01 13:48:26 compute-0 sudo[152662]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:27 compute-0 sudo[152875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npbiwztnamufoepdprxfklfnlpvtoozt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326506.844928-2102-36835664187410/AnsiballZ_systemd.py'
Oct 01 13:48:27 compute-0 sudo[152875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:27 compute-0 python3.9[152877]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:48:27 compute-0 systemd[1]: Reloading.
Oct 01 13:48:27 compute-0 systemd-sysv-generator[152907]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:48:27 compute-0 systemd-rc-local-generator[152902]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:48:27 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Oct 01 13:48:27 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Oct 01 13:48:27 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Oct 01 13:48:27 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct 01 13:48:27 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct 01 13:48:27 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct 01 13:48:27 compute-0 systemd[1]: Starting libvirt secret daemon...
Oct 01 13:48:27 compute-0 systemd[1]: Started libvirt secret daemon.
Oct 01 13:48:27 compute-0 sudo[152875]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:28 compute-0 sudo[153085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyjfauwhwoijrqycjpdwzdgigigryhhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326508.331965-2176-190012802699176/AnsiballZ_file.py'
Oct 01 13:48:28 compute-0 sudo[153085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:28 compute-0 python3.9[153087]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:28 compute-0 sudo[153085]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:29 compute-0 sudo[153237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvrwlbuuyzfgnmftyskrcaduvkkviihv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326509.0947149-2192-163888710261568/AnsiballZ_find.py'
Oct 01 13:48:29 compute-0 sudo[153237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:29 compute-0 python3.9[153239]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 01 13:48:29 compute-0 sudo[153237]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:30 compute-0 sudo[153389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwstgxdlsdfpdjgiqjhcztkoxmespivz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326510.1992083-2220-197403893181684/AnsiballZ_stat.py'
Oct 01 13:48:30 compute-0 sudo[153389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:30 compute-0 python3.9[153391]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:30 compute-0 sudo[153389]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:31 compute-0 sudo[153512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-labcvztuyfioxsovioilcgqcjsuunvxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326510.1992083-2220-197403893181684/AnsiballZ_copy.py'
Oct 01 13:48:31 compute-0 sudo[153512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:31 compute-0 python3.9[153514]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326510.1992083-2220-197403893181684/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:31 compute-0 sudo[153512]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:32 compute-0 sudo[153664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybepsosntmmloblrehksvipgcrnbsprj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326511.8086512-2252-199834415101053/AnsiballZ_file.py'
Oct 01 13:48:32 compute-0 sudo[153664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:32 compute-0 python3.9[153666]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:32 compute-0 sudo[153664]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:32 compute-0 sudo[153816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woxezexlwqmrxddutwyyfgphyteqeaqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326512.5551598-2268-220833028552371/AnsiballZ_stat.py'
Oct 01 13:48:32 compute-0 sudo[153816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:33 compute-0 python3.9[153818]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:33 compute-0 sudo[153816]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:33 compute-0 sudo[153894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myclwjuhigmhghpnkigfhskimiqshzce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326512.5551598-2268-220833028552371/AnsiballZ_file.py'
Oct 01 13:48:33 compute-0 sudo[153894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:33 compute-0 python3.9[153896]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:33 compute-0 sudo[153894]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:34 compute-0 sudo[154046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysurzxavehkasglssbreahziagjhmcty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326513.929678-2292-120769898468389/AnsiballZ_stat.py'
Oct 01 13:48:34 compute-0 sudo[154046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:34 compute-0 python3.9[154048]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:34 compute-0 sudo[154046]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:34 compute-0 sudo[154124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jitnzclynwmdcgdftlolbxpszevrjuct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326513.929678-2292-120769898468389/AnsiballZ_file.py'
Oct 01 13:48:34 compute-0 sudo[154124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:35 compute-0 python3.9[154126]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.vzqiox18 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:35 compute-0 sudo[154124]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:35 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct 01 13:48:35 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.016s CPU time.
Oct 01 13:48:35 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct 01 13:48:35 compute-0 sudo[154277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yydmxhsrqvwzxcupcwiunqwdmsidqkwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326515.2771769-2316-246374731873993/AnsiballZ_stat.py'
Oct 01 13:48:35 compute-0 sudo[154277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:35 compute-0 python3.9[154279]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:35 compute-0 sudo[154277]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:36 compute-0 sudo[154355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdmvwyfapdonmwrupegyzrmmlpmbllot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326515.2771769-2316-246374731873993/AnsiballZ_file.py'
Oct 01 13:48:36 compute-0 sudo[154355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:36 compute-0 python3.9[154357]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:36 compute-0 sudo[154355]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:37 compute-0 sudo[154507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shxfqebnjujgjazwahbimcdtkcqywhlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326516.7855644-2342-201212986701950/AnsiballZ_command.py'
Oct 01 13:48:37 compute-0 sudo[154507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:37 compute-0 python3.9[154509]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:48:37 compute-0 sudo[154507]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:38 compute-0 sudo[154660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpiusfinbyjfsjfftiggeeqtcabmeqtc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759326517.6558447-2358-3016408489178/AnsiballZ_edpm_nftables_from_files.py'
Oct 01 13:48:38 compute-0 sudo[154660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:38 compute-0 python3[154662]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 01 13:48:38 compute-0 sudo[154660]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:39 compute-0 sudo[154812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cshnmhlagkcguijlhilbxxmgnqziysmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326518.725395-2374-117912844557800/AnsiballZ_stat.py'
Oct 01 13:48:39 compute-0 sudo[154812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:39 compute-0 python3.9[154814]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:39 compute-0 sudo[154812]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:39 compute-0 sudo[154890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvdzchabbhelozpbifpnztaqcspkzpjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326518.725395-2374-117912844557800/AnsiballZ_file.py'
Oct 01 13:48:39 compute-0 sudo[154890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:39 compute-0 python3.9[154892]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:39 compute-0 sudo[154890]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:40 compute-0 sudo[155042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucngthwghlrbphtroqmfcctkdaymueyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326520.1459215-2398-72077585942666/AnsiballZ_stat.py'
Oct 01 13:48:40 compute-0 sudo[155042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:40 compute-0 python3.9[155044]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:40 compute-0 sudo[155042]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:41 compute-0 sudo[155120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecwlumibqhtxqiwyxzjbivywgmzztvpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326520.1459215-2398-72077585942666/AnsiballZ_file.py'
Oct 01 13:48:41 compute-0 sudo[155120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:41 compute-0 python3.9[155122]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:41 compute-0 sudo[155120]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:42 compute-0 sudo[155272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chypyjzidbxduolhhhbgaquqjqfiuhye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326521.6585498-2422-148697410578010/AnsiballZ_stat.py'
Oct 01 13:48:42 compute-0 sudo[155272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:42 compute-0 python3.9[155274]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:42 compute-0 sudo[155272]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:42 compute-0 sudo[155350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncpmnqimjfnqiijvcuommeedpausnbzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326521.6585498-2422-148697410578010/AnsiballZ_file.py'
Oct 01 13:48:42 compute-0 sudo[155350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:42 compute-0 python3.9[155352]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:42 compute-0 sudo[155350]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:43 compute-0 sudo[155502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdazpihchxcqoltspvasibgyihxayxhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326523.1186268-2446-265986612247888/AnsiballZ_stat.py'
Oct 01 13:48:43 compute-0 sudo[155502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:43 compute-0 python3.9[155504]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:43 compute-0 sudo[155502]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:44 compute-0 sudo[155580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msjhqcazjcqifwepccqtygkmfpixbohl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326523.1186268-2446-265986612247888/AnsiballZ_file.py'
Oct 01 13:48:44 compute-0 sudo[155580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:44 compute-0 python3.9[155582]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:44 compute-0 sudo[155580]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:45 compute-0 sudo[155732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrgvnpcttzlyguehpiuvywvudpglzsnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326524.551608-2470-22262002104505/AnsiballZ_stat.py'
Oct 01 13:48:45 compute-0 sudo[155732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:45 compute-0 python3.9[155734]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:45 compute-0 sudo[155732]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:45 compute-0 sudo[155857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lacnorurkddulkldxtebrmgermamonge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326524.551608-2470-22262002104505/AnsiballZ_copy.py'
Oct 01 13:48:45 compute-0 sudo[155857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:46 compute-0 python3.9[155859]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326524.551608-2470-22262002104505/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:46 compute-0 sudo[155857]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:46 compute-0 sudo[156039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smjdsxfslygqmtaoqmqukozlenqvtryh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326526.3633184-2500-170578234111403/AnsiballZ_file.py'
Oct 01 13:48:46 compute-0 sudo[156039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:46 compute-0 podman[155983]: 2025-10-01 13:48:46.816125082 +0000 UTC m=+0.093714758 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct 01 13:48:46 compute-0 podman[155984]: 2025-10-01 13:48:46.92347088 +0000 UTC m=+0.200468189 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct 01 13:48:47 compute-0 python3.9[156049]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:47 compute-0 sudo[156039]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:47 compute-0 sudo[156207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsxnfbxblswrjinrqhddlgmfuiiwnltr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326527.320606-2516-152848389330838/AnsiballZ_command.py'
Oct 01 13:48:47 compute-0 sudo[156207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:47 compute-0 python3.9[156209]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:48:48 compute-0 sudo[156207]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:48 compute-0 sudo[156362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzuklzkcngsbvlyhcopwienytqggpfgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326528.257835-2532-206803554121484/AnsiballZ_blockinfile.py'
Oct 01 13:48:48 compute-0 sudo[156362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:48 compute-0 python3.9[156364]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:49 compute-0 sudo[156362]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:49 compute-0 sudo[156514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfjltcocgdstclqulojcehtvjkvjxulq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326529.3544014-2550-62216564876716/AnsiballZ_command.py'
Oct 01 13:48:49 compute-0 sudo[156514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:49 compute-0 python3.9[156516]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:48:50 compute-0 sudo[156514]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:50 compute-0 sudo[156667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgkhuhwltxitjljtiyjaltsbinnhthrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326530.2302055-2566-53486971996037/AnsiballZ_stat.py'
Oct 01 13:48:50 compute-0 sudo[156667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:50 compute-0 python3.9[156669]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:48:50 compute-0 sudo[156667]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:51 compute-0 sudo[156821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wupnfqdxxiqviijutrwdaxxymgjrdmsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326531.068878-2582-159325367518402/AnsiballZ_command.py'
Oct 01 13:48:51 compute-0 sudo[156821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:51 compute-0 python3.9[156823]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:48:51 compute-0 sudo[156821]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:52 compute-0 sudo[156976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljcwatyyltmxhxeslsfxyvmifucxxygb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326531.9487424-2598-169500309733489/AnsiballZ_file.py'
Oct 01 13:48:52 compute-0 sudo[156976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:52 compute-0 python3.9[156978]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:52 compute-0 sudo[156976]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:53 compute-0 sudo[157128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvfbdjkvnkzhvaocmebstrxapurtdqog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326532.8450131-2614-56361793735535/AnsiballZ_stat.py'
Oct 01 13:48:53 compute-0 sudo[157128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:53 compute-0 python3.9[157130]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:53 compute-0 sudo[157128]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:53 compute-0 sudo[157251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oucfufzoabkgkkksneexxqioynahsmbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326532.8450131-2614-56361793735535/AnsiballZ_copy.py'
Oct 01 13:48:53 compute-0 sudo[157251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:54 compute-0 python3.9[157253]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326532.8450131-2614-56361793735535/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:54 compute-0 sudo[157251]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:54 compute-0 sudo[157403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaqlqvksibolftplnsrktrijvzjwarbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326534.245212-2644-205102604784229/AnsiballZ_stat.py'
Oct 01 13:48:54 compute-0 sudo[157403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:54 compute-0 python3.9[157405]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:54 compute-0 sudo[157403]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:55 compute-0 sudo[157526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjtrqqiwexcjepxmbpljvywohexlcaky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326534.245212-2644-205102604784229/AnsiballZ_copy.py'
Oct 01 13:48:55 compute-0 sudo[157526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:55 compute-0 python3.9[157528]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326534.245212-2644-205102604784229/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:55 compute-0 sudo[157526]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:56 compute-0 sudo[157678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvkwzlzcpzongcahkrwtfjssysnhsdls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326535.8293197-2674-181855399371295/AnsiballZ_stat.py'
Oct 01 13:48:56 compute-0 sudo[157678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:56 compute-0 python3.9[157680]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:48:56 compute-0 sudo[157678]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:56 compute-0 sudo[157801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eidwvkmsdibdrmxhzxsqoswbqmkjsiuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326535.8293197-2674-181855399371295/AnsiballZ_copy.py'
Oct 01 13:48:56 compute-0 sudo[157801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:57 compute-0 python3.9[157803]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326535.8293197-2674-181855399371295/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:48:57 compute-0 sudo[157801]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:57 compute-0 sudo[157953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vttlgkzuotsddmjtqvudyxbpuwrhomtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326537.33213-2704-138148603905330/AnsiballZ_systemd.py'
Oct 01 13:48:57 compute-0 sudo[157953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:58 compute-0 python3.9[157955]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:48:58 compute-0 systemd[1]: Reloading.
Oct 01 13:48:58 compute-0 systemd-rc-local-generator[157980]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:48:58 compute-0 systemd-sysv-generator[157984]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:48:58 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Oct 01 13:48:58 compute-0 sudo[157953]: pam_unix(sudo:session): session closed for user root
Oct 01 13:48:59 compute-0 sudo[158144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxjblpfyqqmsrypdtrvcolxodytfdzso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326538.6770904-2720-82618755851642/AnsiballZ_systemd.py'
Oct 01 13:48:59 compute-0 sudo[158144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:48:59 compute-0 python3.9[158146]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct 01 13:48:59 compute-0 systemd[1]: Reloading.
Oct 01 13:48:59 compute-0 systemd-rc-local-generator[158173]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:48:59 compute-0 systemd-sysv-generator[158178]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:48:59 compute-0 systemd[1]: Reloading.
Oct 01 13:48:59 compute-0 systemd-rc-local-generator[158213]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:48:59 compute-0 systemd-sysv-generator[158217]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:49:00 compute-0 sudo[158144]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:00 compute-0 sshd-session[103918]: Connection closed by 192.168.122.30 port 44150
Oct 01 13:49:00 compute-0 sshd-session[103915]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:49:00 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Oct 01 13:49:00 compute-0 systemd[1]: session-24.scope: Consumed 3min 56.223s CPU time.
Oct 01 13:49:00 compute-0 systemd-logind[791]: Session 24 logged out. Waiting for processes to exit.
Oct 01 13:49:00 compute-0 systemd-logind[791]: Removed session 24.
Oct 01 13:49:05 compute-0 sshd-session[158243]: Accepted publickey for zuul from 192.168.122.30 port 48850 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:49:05 compute-0 systemd-logind[791]: New session 25 of user zuul.
Oct 01 13:49:05 compute-0 systemd[1]: Started Session 25 of User zuul.
Oct 01 13:49:05 compute-0 sshd-session[158243]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:49:06 compute-0 python3.9[158396]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:49:07 compute-0 sudo[158550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpfqwbheibgjloewuyuurvqisuthhihd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326547.3979845-48-241353837745070/AnsiballZ_file.py'
Oct 01 13:49:07 compute-0 sudo[158550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:08 compute-0 python3.9[158552]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:49:08 compute-0 sudo[158550]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:08 compute-0 sudo[158702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lthguifcwuvouitgnnjkwmizqezfwydg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326548.4040139-48-197259495049403/AnsiballZ_file.py'
Oct 01 13:49:08 compute-0 sudo[158702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:09 compute-0 python3.9[158704]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:49:09 compute-0 sudo[158702]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:09 compute-0 sudo[158854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-menksqqftudmtzaoblzopfgkzykwbkpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326549.2210708-48-190003674845609/AnsiballZ_file.py'
Oct 01 13:49:09 compute-0 sudo[158854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:09 compute-0 python3.9[158856]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:49:09 compute-0 sudo[158854]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:10 compute-0 sudo[159006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iiflescefprpnvegyuvtpxvcdwesazdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326550.0382113-48-137931566139276/AnsiballZ_file.py'
Oct 01 13:49:10 compute-0 sudo[159006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:10 compute-0 python3.9[159008]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 01 13:49:10 compute-0 sudo[159006]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:11 compute-0 sudo[159158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clhkzjgvkznxqrrzlfdvjqjomrmipowb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326550.9329646-48-8045745774198/AnsiballZ_file.py'
Oct 01 13:49:11 compute-0 sudo[159158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:11 compute-0 python3.9[159160]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:49:11 compute-0 sudo[159158]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:12 compute-0 sudo[159310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhjxmerbqlqcksrsmjeeywfvbvsdgcly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326551.8090558-120-150497691060077/AnsiballZ_stat.py'
Oct 01 13:49:12 compute-0 sudo[159310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:12 compute-0 python3.9[159312]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:49:12 compute-0 sudo[159310]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:13 compute-0 sudo[159464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-estgryiozdctrfytxtaeccleiapenuin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326552.8934286-136-33490984919090/AnsiballZ_systemd.py'
Oct 01 13:49:13 compute-0 sudo[159464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:13 compute-0 python3.9[159466]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:49:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:49:14.190 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:49:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:49:14.192 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:49:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:49:14.192 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:49:14 compute-0 systemd[1]: Reloading.
Oct 01 13:49:15 compute-0 systemd-rc-local-generator[159495]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:49:15 compute-0 systemd-sysv-generator[159500]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:49:15 compute-0 sudo[159464]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:16 compute-0 sudo[159654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cduqeotbmjssvjpmlxcjdgfbwunkvzzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326555.576846-152-78107181787650/AnsiballZ_service_facts.py'
Oct 01 13:49:16 compute-0 sudo[159654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:16 compute-0 python3.9[159656]: ansible-ansible.builtin.service_facts Invoked
Oct 01 13:49:16 compute-0 network[159673]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 01 13:49:16 compute-0 network[159674]: 'network-scripts' will be removed from distribution in near future.
Oct 01 13:49:16 compute-0 network[159675]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 01 13:49:17 compute-0 podman[159681]: 2025-10-01 13:49:17.360778909 +0000 UTC m=+0.094698308 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 01 13:49:17 compute-0 podman[159683]: 2025-10-01 13:49:17.427346475 +0000 UTC m=+0.162169319 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 13:49:22 compute-0 sudo[159654]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:22 compute-0 sudo[159992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xupiuadqhuwuisydvswlpmqrkamfkhor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326562.336099-168-121870804941430/AnsiballZ_systemd.py'
Oct 01 13:49:22 compute-0 sudo[159992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:23 compute-0 python3.9[159994]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:49:23 compute-0 systemd[1]: Reloading.
Oct 01 13:49:23 compute-0 systemd-rc-local-generator[160026]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:49:23 compute-0 systemd-sysv-generator[160029]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:49:23 compute-0 sudo[159992]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:24 compute-0 python3.9[160183]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:49:25 compute-0 sudo[160333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fotacatqkzphkvabszyrkotkmmddfvaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326564.55966-202-168465646438014/AnsiballZ_podman_container.py'
Oct 01 13:49:25 compute-0 sudo[160333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:25 compute-0 python3.9[160335]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 01 13:49:25 compute-0 podman[160373]: 2025-10-01 13:49:25.723425783 +0000 UTC m=+0.074519445 container create 853b16c83b351f8f021851440bef869891451bdf2369a39558d6763f654b49d8 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 13:49:25 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 01 13:49:25 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 01 13:49:25 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.7704] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/20)
Oct 01 13:49:25 compute-0 podman[160373]: 2025-10-01 13:49:25.690970063 +0000 UTC m=+0.042063815 image pull a742884d734e475a9ceb7e186a2d8775781675f700ff62f05c1b64d66e08b90f 38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Oct 01 13:49:25 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Oct 01 13:49:25 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 01 13:49:25 compute-0 kernel: veth0: entered allmulticast mode
Oct 01 13:49:25 compute-0 kernel: veth0: entered promiscuous mode
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.7981] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Oct 01 13:49:25 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Oct 01 13:49:25 compute-0 kernel: podman0: port 1(veth0) entered forwarding state
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.8036] device (veth0): carrier: link connected
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.8044] device (podman0): carrier: link connected
Oct 01 13:49:25 compute-0 systemd-udevd[160400]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 13:49:25 compute-0 systemd-udevd[160403]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.8531] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.8544] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.8556] device (podman0): Activation: starting connection 'podman0' (7b11f6ee-d626-4066-a138-8a62dce65317)
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.8558] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.8562] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.8568] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.8571] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct 01 13:49:25 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct 01 13:49:25 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.9004] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.9020] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct 01 13:49:25 compute-0 NetworkManager[51741]: <info>  [1759326565.9028] device (podman0): Activation: successful, device activated.
Oct 01 13:49:25 compute-0 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct 01 13:49:26 compute-0 systemd[1]: Started libpod-conmon-853b16c83b351f8f021851440bef869891451bdf2369a39558d6763f654b49d8.scope.
Oct 01 13:49:26 compute-0 systemd[1]: Started libcrun container.
Oct 01 13:49:26 compute-0 podman[160373]: 2025-10-01 13:49:26.28300155 +0000 UTC m=+0.634095312 container init 853b16c83b351f8f021851440bef869891451bdf2369a39558d6763f654b49d8 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 01 13:49:26 compute-0 podman[160373]: 2025-10-01 13:49:26.297147958 +0000 UTC m=+0.648241620 container start 853b16c83b351f8f021851440bef869891451bdf2369a39558d6763f654b49d8 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:49:26 compute-0 podman[160373]: 2025-10-01 13:49:26.301111717 +0000 UTC m=+0.652205409 container attach 853b16c83b351f8f021851440bef869891451bdf2369a39558d6763f654b49d8 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:49:26 compute-0 iscsid_config[160532]: iqn.1994-05.com.redhat:84a2d33489a
Oct 01 13:49:26 compute-0 systemd[1]: libpod-853b16c83b351f8f021851440bef869891451bdf2369a39558d6763f654b49d8.scope: Deactivated successfully.
Oct 01 13:49:26 compute-0 conmon[160532]: conmon 853b16c83b351f8f0218 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-853b16c83b351f8f021851440bef869891451bdf2369a39558d6763f654b49d8.scope/container/memory.events
Oct 01 13:49:26 compute-0 podman[160373]: 2025-10-01 13:49:26.308558331 +0000 UTC m=+0.659652023 container died 853b16c83b351f8f021851440bef869891451bdf2369a39558d6763f654b49d8 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 01 13:49:26 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 01 13:49:26 compute-0 kernel: veth0 (unregistering): left allmulticast mode
Oct 01 13:49:26 compute-0 kernel: veth0 (unregistering): left promiscuous mode
Oct 01 13:49:26 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct 01 13:49:26 compute-0 NetworkManager[51741]: <info>  [1759326566.3779] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 13:49:26 compute-0 systemd[1]: run-netns-netns\x2d146fa6e0\x2dd8ea\x2d9420\x2dd055\x2d96045582d2f3.mount: Deactivated successfully.
Oct 01 13:49:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-853b16c83b351f8f021851440bef869891451bdf2369a39558d6763f654b49d8-userdata-shm.mount: Deactivated successfully.
Oct 01 13:49:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-98161467d4b2cd2227168a3b485ecaf2c155fa90f238bab4829c8524e18347d1-merged.mount: Deactivated successfully.
Oct 01 13:49:26 compute-0 podman[160373]: 2025-10-01 13:49:26.762122481 +0000 UTC m=+1.113216173 container remove 853b16c83b351f8f021851440bef869891451bdf2369a39558d6763f654b49d8 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:49:26 compute-0 python3.9[160335]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True 38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest /usr/sbin/iscsi-iname
Oct 01 13:49:26 compute-0 systemd[1]: libpod-conmon-853b16c83b351f8f021851440bef869891451bdf2369a39558d6763f654b49d8.scope: Deactivated successfully.
Oct 01 13:49:26 compute-0 python3.9[160335]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: 
                                             DEPRECATED command:
                                             It is recommended to use Quadlets for running containers and pods under systemd.
                                             
                                             Please refer to podman-systemd.unit(5) for details.
                                             Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Oct 01 13:49:26 compute-0 sudo[160333]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:27 compute-0 sudo[160774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyohrtlcjntggazlyslvzffjssmzddxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326567.1561127-218-221377320387943/AnsiballZ_stat.py'
Oct 01 13:49:27 compute-0 sudo[160774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:27 compute-0 python3.9[160776]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:49:27 compute-0 sudo[160774]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:28 compute-0 sudo[160897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aunhfrzybkowaybtyntfhaxfxzdhlvpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326567.1561127-218-221377320387943/AnsiballZ_copy.py'
Oct 01 13:49:28 compute-0 sudo[160897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:28 compute-0 python3.9[160899]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326567.1561127-218-221377320387943/.source.iscsi _original_basename=.zm8sm0eo follow=False checksum=714de5c475f58e9502527c4eae4792e034d47600 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:49:28 compute-0 sudo[160897]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:29 compute-0 sudo[161049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utqbvpvtpuvmtuczfhdlpmhrgimkupqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326568.8857257-248-139182203046919/AnsiballZ_file.py'
Oct 01 13:49:29 compute-0 sudo[161049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:29 compute-0 python3.9[161051]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:49:29 compute-0 sudo[161049]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:30 compute-0 python3.9[161201]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:49:31 compute-0 sudo[161353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spmpkprazqoowcfyuvlhzvqavaczbxvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326570.6138809-282-245789255200564/AnsiballZ_lineinfile.py'
Oct 01 13:49:31 compute-0 sudo[161353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:31 compute-0 python3.9[161355]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:49:31 compute-0 sudo[161353]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:32 compute-0 sudo[161505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvtcejthkiraydpholykwrdaczdepxwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326571.7687244-300-112427302740002/AnsiballZ_file.py'
Oct 01 13:49:32 compute-0 sudo[161505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:32 compute-0 python3.9[161507]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:49:32 compute-0 sudo[161505]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:33 compute-0 sudo[161657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ochaysczkuewrgoxpfcwckaeqhddsnzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326572.65252-316-119892717623215/AnsiballZ_stat.py'
Oct 01 13:49:33 compute-0 sudo[161657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:33 compute-0 python3.9[161659]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:49:33 compute-0 sudo[161657]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:33 compute-0 sudo[161735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hehsoytcqrktlwgatzxwcossssnfeljv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326572.65252-316-119892717623215/AnsiballZ_file.py'
Oct 01 13:49:33 compute-0 sudo[161735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:33 compute-0 python3.9[161737]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:49:33 compute-0 sudo[161735]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:34 compute-0 sudo[161887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvbejilmgvgdulkccsomhnhldyixemmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326574.0524983-316-132323799191945/AnsiballZ_stat.py'
Oct 01 13:49:34 compute-0 sudo[161887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:34 compute-0 python3.9[161889]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:49:34 compute-0 sudo[161887]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:34 compute-0 sudo[161965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msxtuvsxxakeotnknsemruzwmavthhtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326574.0524983-316-132323799191945/AnsiballZ_file.py'
Oct 01 13:49:34 compute-0 sudo[161965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:35 compute-0 python3.9[161967]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:49:35 compute-0 sudo[161965]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:35 compute-0 sudo[162117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycjeebjbgwqxefmxvpmiqqqxtvcxaooa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326575.4399326-362-119987724378787/AnsiballZ_file.py'
Oct 01 13:49:35 compute-0 sudo[162117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:36 compute-0 python3.9[162119]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:49:36 compute-0 sudo[162117]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:36 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct 01 13:49:36 compute-0 sudo[162269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocmfylsalaicykipmknxkawsdajqikxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326576.3193495-378-96544451296964/AnsiballZ_stat.py'
Oct 01 13:49:36 compute-0 sudo[162269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:36 compute-0 python3.9[162271]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:49:36 compute-0 sudo[162269]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:37 compute-0 sudo[162347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oehfpiqqsbmzsnspujfutcpaqnjcwreb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326576.3193495-378-96544451296964/AnsiballZ_file.py'
Oct 01 13:49:37 compute-0 sudo[162347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:37 compute-0 python3.9[162349]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:49:37 compute-0 sudo[162347]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:38 compute-0 sudo[162499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jurpfodpyxsuuzgnnegzzaskzwuwrhqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326577.7379897-402-37717078517549/AnsiballZ_stat.py'
Oct 01 13:49:38 compute-0 sudo[162499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:38 compute-0 python3.9[162501]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:49:38 compute-0 sudo[162499]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:38 compute-0 sudo[162577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzoglxvnfbkkozsyftcfzdoiuntyxxtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326577.7379897-402-37717078517549/AnsiballZ_file.py'
Oct 01 13:49:38 compute-0 sudo[162577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:39 compute-0 python3.9[162579]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:49:39 compute-0 sudo[162577]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:39 compute-0 sudo[162729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljehonlxwuwcszwzxabbfdtmfgqpnotq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326579.24584-426-78917160475815/AnsiballZ_systemd.py'
Oct 01 13:49:39 compute-0 sudo[162729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:40 compute-0 python3.9[162731]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:49:40 compute-0 systemd[1]: Reloading.
Oct 01 13:49:40 compute-0 systemd-rc-local-generator[162757]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:49:40 compute-0 systemd-sysv-generator[162761]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:49:40 compute-0 sudo[162729]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:41 compute-0 sudo[162918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szafpzkrrgnzmvzoijsbjzxwpurflqhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326580.6404834-442-238661104315302/AnsiballZ_stat.py'
Oct 01 13:49:41 compute-0 sudo[162918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:41 compute-0 python3.9[162920]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:49:41 compute-0 sudo[162918]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:41 compute-0 sudo[162996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xphtxgwfursqptkhypxfjrewqegxtbpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326580.6404834-442-238661104315302/AnsiballZ_file.py'
Oct 01 13:49:41 compute-0 sudo[162996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:41 compute-0 python3.9[162998]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:49:41 compute-0 sudo[162996]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:42 compute-0 sudo[163148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fufhshgqeoxlosfrafvkrgnnwznqcjah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326582.0485878-466-176889901091137/AnsiballZ_stat.py'
Oct 01 13:49:42 compute-0 sudo[163148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:42 compute-0 python3.9[163150]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:49:42 compute-0 sudo[163148]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:42 compute-0 sudo[163226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnervxnklsvpounugueseeuplpxdzfip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326582.0485878-466-176889901091137/AnsiballZ_file.py'
Oct 01 13:49:42 compute-0 sudo[163226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:43 compute-0 python3.9[163228]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:49:43 compute-0 sudo[163226]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:43 compute-0 sudo[163378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfuadhskruyqpznkmzanmeywnjqsjasg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326583.4150195-490-97717438290158/AnsiballZ_systemd.py'
Oct 01 13:49:43 compute-0 sudo[163378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:44 compute-0 python3.9[163380]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:49:44 compute-0 systemd[1]: Reloading.
Oct 01 13:49:44 compute-0 systemd-rc-local-generator[163405]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:49:44 compute-0 systemd-sysv-generator[163410]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:49:44 compute-0 systemd[1]: Starting Create netns directory...
Oct 01 13:49:44 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 01 13:49:44 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 01 13:49:44 compute-0 systemd[1]: Finished Create netns directory.
Oct 01 13:49:44 compute-0 sudo[163378]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:45 compute-0 sudo[163571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulllwrwjrmxpuyyvvjyeityyhypwytvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326584.9429116-510-47372186232600/AnsiballZ_file.py'
Oct 01 13:49:45 compute-0 sudo[163571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:45 compute-0 python3.9[163573]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:49:45 compute-0 sudo[163571]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:46 compute-0 sudo[163723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkdtecauvrunbttboyrgyglgqlqmighw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326585.8404417-526-706953689738/AnsiballZ_stat.py'
Oct 01 13:49:46 compute-0 sudo[163723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:46 compute-0 python3.9[163725]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:49:46 compute-0 sudo[163723]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:46 compute-0 sudo[163846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spmnxpwtkuojwskijisyxbmabipmwdse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326585.8404417-526-706953689738/AnsiballZ_copy.py'
Oct 01 13:49:46 compute-0 sudo[163846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:47 compute-0 python3.9[163848]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326585.8404417-526-706953689738/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:49:47 compute-0 sudo[163846]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:48 compute-0 sudo[164024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-focktbqcnwckevdiprysiqnwoeddtwmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326587.6232414-560-167748528063638/AnsiballZ_file.py'
Oct 01 13:49:48 compute-0 sudo[164024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:48 compute-0 podman[163972]: 2025-10-01 13:49:48.046051642 +0000 UTC m=+0.089840855 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 01 13:49:48 compute-0 podman[163973]: 2025-10-01 13:49:48.094684826 +0000 UTC m=+0.136444933 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 01 13:49:48 compute-0 python3.9[164038]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:49:48 compute-0 sudo[164024]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:48 compute-0 sudo[164195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nttbqoiiihfyqrpukdysqiceoyupzbnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326588.4991808-576-223350937169945/AnsiballZ_stat.py'
Oct 01 13:49:48 compute-0 sudo[164195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:49 compute-0 python3.9[164197]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:49:49 compute-0 sudo[164195]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:49 compute-0 sudo[164318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbfosizaehsbrrjiirqomssrfceqkiwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326588.4991808-576-223350937169945/AnsiballZ_copy.py'
Oct 01 13:49:49 compute-0 sudo[164318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:49 compute-0 python3.9[164320]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326588.4991808-576-223350937169945/.source.json _original_basename=.ztmkv1td follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:49:49 compute-0 sudo[164318]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:50 compute-0 sudo[164470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcaruxhrrdkrxmmnogekbyxqsrvpdcoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326589.9817107-606-217932371305196/AnsiballZ_file.py'
Oct 01 13:49:50 compute-0 sudo[164470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:50 compute-0 python3.9[164472]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:49:50 compute-0 sudo[164470]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:51 compute-0 sudo[164622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvajskldviialfnwnuxettyufdckwvpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326590.8413565-622-249272301215453/AnsiballZ_stat.py'
Oct 01 13:49:51 compute-0 sudo[164622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:51 compute-0 sudo[164622]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:51 compute-0 sudo[164745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnbdusmoesdzlqpfabyggwixibplkhpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326590.8413565-622-249272301215453/AnsiballZ_copy.py'
Oct 01 13:49:51 compute-0 sudo[164745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:52 compute-0 sudo[164745]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:53 compute-0 sudo[164897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fquyhjcluipqueuqbyhuilyqrsstsoot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326592.606421-656-96025841465895/AnsiballZ_container_config_data.py'
Oct 01 13:49:53 compute-0 sudo[164897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:53 compute-0 python3.9[164899]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct 01 13:49:53 compute-0 sudo[164897]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:54 compute-0 sudo[165049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulignwdcleodwucvfznlhftbikuqgyfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326593.621947-674-44313326498446/AnsiballZ_container_config_hash.py'
Oct 01 13:49:54 compute-0 sudo[165049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:54 compute-0 python3.9[165051]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 01 13:49:54 compute-0 sudo[165049]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:55 compute-0 sudo[165201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxmzzqkmgsufzjnpffgdznqpvtyfpweo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326594.7290647-692-162178064858903/AnsiballZ_podman_container_info.py'
Oct 01 13:49:55 compute-0 sudo[165201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:55 compute-0 python3.9[165203]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 01 13:49:55 compute-0 sudo[165201]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:56 compute-0 sudo[165379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilauggqoubhwxbvmbizjncwgybktmvjd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759326596.2834013-718-144443223382609/AnsiballZ_edpm_container_manage.py'
Oct 01 13:49:56 compute-0 sudo[165379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:57 compute-0 python3[165381]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 01 13:49:57 compute-0 podman[165419]: 2025-10-01 13:49:57.471719209 +0000 UTC m=+0.056540622 container create 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Oct 01 13:49:57 compute-0 podman[165419]: 2025-10-01 13:49:57.439798704 +0000 UTC m=+0.024620147 image pull a742884d734e475a9ceb7e186a2d8775781675f700ff62f05c1b64d66e08b90f 38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Oct 01 13:49:57 compute-0 python3[165381]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z 38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Oct 01 13:49:57 compute-0 sudo[165379]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:58 compute-0 sudo[165607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlqeixyjxvqouaeumuusejcwpolruypz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326597.8755114-734-38030611248044/AnsiballZ_stat.py'
Oct 01 13:49:58 compute-0 sudo[165607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:58 compute-0 python3.9[165609]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:49:58 compute-0 sudo[165607]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:59 compute-0 sudo[165761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dipdijjgthaymjpijnkrncftotdaqvja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326598.7900927-752-193775957800017/AnsiballZ_file.py'
Oct 01 13:49:59 compute-0 sudo[165761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:59 compute-0 python3.9[165763]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:49:59 compute-0 sudo[165761]: pam_unix(sudo:session): session closed for user root
Oct 01 13:49:59 compute-0 sudo[165837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwanagxubwtcbbkojjmxyuuvfrvyaaqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326598.7900927-752-193775957800017/AnsiballZ_stat.py'
Oct 01 13:49:59 compute-0 sudo[165837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:49:59 compute-0 python3.9[165839]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:49:59 compute-0 sudo[165837]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:00 compute-0 sudo[165988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrlwiktuuonmnkaopgrufzonszsmuzeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326599.950164-752-67046050178548/AnsiballZ_copy.py'
Oct 01 13:50:00 compute-0 sudo[165988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:00 compute-0 python3.9[165990]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759326599.950164-752-67046050178548/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:00 compute-0 sudo[165988]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:00 compute-0 sudo[166064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmyeztrsxszkqzlrsxgesefrxyblodqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326599.950164-752-67046050178548/AnsiballZ_systemd.py'
Oct 01 13:50:00 compute-0 sudo[166064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:01 compute-0 python3.9[166066]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:50:01 compute-0 systemd[1]: Reloading.
Oct 01 13:50:01 compute-0 systemd-rc-local-generator[166094]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:50:01 compute-0 systemd-sysv-generator[166099]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:50:01 compute-0 sudo[166064]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:01 compute-0 sudo[166175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivnkfxboaqjyuecrbdzqhgrrorhfuxsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326599.950164-752-67046050178548/AnsiballZ_systemd.py'
Oct 01 13:50:01 compute-0 sudo[166175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:02 compute-0 python3.9[166177]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:50:02 compute-0 systemd[1]: Reloading.
Oct 01 13:50:02 compute-0 systemd-rc-local-generator[166202]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:50:02 compute-0 systemd-sysv-generator[166207]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:50:02 compute-0 systemd[1]: Starting iscsid container...
Oct 01 13:50:02 compute-0 systemd[1]: Started libcrun container.
Oct 01 13:50:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f141d5cd345c36427ebc76af19ccffffb76578bc59bae197fdc9e0a5b582c09/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct 01 13:50:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f141d5cd345c36427ebc76af19ccffffb76578bc59bae197fdc9e0a5b582c09/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 01 13:50:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f141d5cd345c36427ebc76af19ccffffb76578bc59bae197fdc9e0a5b582c09/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 01 13:50:02 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9.
Oct 01 13:50:02 compute-0 podman[166217]: 2025-10-01 13:50:02.748637544 +0000 UTC m=+0.157667295 container init 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.build-date=20250930, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 01 13:50:02 compute-0 iscsid[166232]: + sudo -E kolla_set_configs
Oct 01 13:50:02 compute-0 sudo[166238]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 01 13:50:02 compute-0 podman[166217]: 2025-10-01 13:50:02.794974675 +0000 UTC m=+0.204004376 container start 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=iscsid)
Oct 01 13:50:02 compute-0 podman[166217]: iscsid
Oct 01 13:50:02 compute-0 systemd[1]: Started iscsid container.
Oct 01 13:50:02 compute-0 systemd[1]: Created slice User Slice of UID 0.
Oct 01 13:50:02 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct 01 13:50:02 compute-0 sudo[166175]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:02 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct 01 13:50:02 compute-0 systemd[1]: Starting User Manager for UID 0...
Oct 01 13:50:02 compute-0 systemd[166251]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Oct 01 13:50:02 compute-0 podman[166239]: 2025-10-01 13:50:02.906677108 +0000 UTC m=+0.090757430 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 01 13:50:02 compute-0 systemd[1]: 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9-24116049c4bff267.service: Main process exited, code=exited, status=1/FAILURE
Oct 01 13:50:02 compute-0 systemd[1]: 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9-24116049c4bff267.service: Failed with result 'exit-code'.
Oct 01 13:50:02 compute-0 systemd[166251]: Queued start job for default target Main User Target.
Oct 01 13:50:03 compute-0 systemd[166251]: Created slice User Application Slice.
Oct 01 13:50:03 compute-0 systemd[166251]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct 01 13:50:03 compute-0 systemd[166251]: Started Daily Cleanup of User's Temporary Directories.
Oct 01 13:50:03 compute-0 systemd[166251]: Reached target Paths.
Oct 01 13:50:03 compute-0 systemd[166251]: Reached target Timers.
Oct 01 13:50:03 compute-0 systemd[166251]: Starting D-Bus User Message Bus Socket...
Oct 01 13:50:03 compute-0 systemd[166251]: Starting Create User's Volatile Files and Directories...
Oct 01 13:50:03 compute-0 systemd[166251]: Listening on D-Bus User Message Bus Socket.
Oct 01 13:50:03 compute-0 systemd[166251]: Reached target Sockets.
Oct 01 13:50:03 compute-0 systemd[166251]: Finished Create User's Volatile Files and Directories.
Oct 01 13:50:03 compute-0 systemd[166251]: Reached target Basic System.
Oct 01 13:50:03 compute-0 systemd[166251]: Reached target Main User Target.
Oct 01 13:50:03 compute-0 systemd[166251]: Startup finished in 135ms.
Oct 01 13:50:03 compute-0 systemd[1]: Started User Manager for UID 0.
Oct 01 13:50:03 compute-0 systemd[1]: Started Session c3 of User root.
Oct 01 13:50:03 compute-0 sudo[166238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 01 13:50:03 compute-0 iscsid[166232]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 01 13:50:03 compute-0 iscsid[166232]: INFO:__main__:Validating config file
Oct 01 13:50:03 compute-0 iscsid[166232]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 01 13:50:03 compute-0 iscsid[166232]: INFO:__main__:Writing out command to execute
Oct 01 13:50:03 compute-0 sudo[166238]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:03 compute-0 systemd[1]: session-c3.scope: Deactivated successfully.
Oct 01 13:50:03 compute-0 iscsid[166232]: ++ cat /run_command
Oct 01 13:50:03 compute-0 iscsid[166232]: + CMD='/usr/sbin/iscsid -f'
Oct 01 13:50:03 compute-0 iscsid[166232]: + ARGS=
Oct 01 13:50:03 compute-0 iscsid[166232]: + sudo kolla_copy_cacerts
Oct 01 13:50:03 compute-0 sudo[166346]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 01 13:50:03 compute-0 systemd[1]: Started Session c4 of User root.
Oct 01 13:50:03 compute-0 sudo[166346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 01 13:50:03 compute-0 sudo[166346]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:03 compute-0 systemd[1]: session-c4.scope: Deactivated successfully.
Oct 01 13:50:03 compute-0 iscsid[166232]: + [[ ! -n '' ]]
Oct 01 13:50:03 compute-0 iscsid[166232]: + . kolla_extend_start
Oct 01 13:50:03 compute-0 iscsid[166232]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct 01 13:50:03 compute-0 iscsid[166232]: Running command: '/usr/sbin/iscsid -f'
Oct 01 13:50:03 compute-0 iscsid[166232]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct 01 13:50:03 compute-0 iscsid[166232]: + umask 0022
Oct 01 13:50:03 compute-0 iscsid[166232]: + exec /usr/sbin/iscsid -f
Oct 01 13:50:03 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Oct 01 13:50:03 compute-0 python3.9[166436]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:50:04 compute-0 sudo[166586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxxbdxkpnpmjndqsvcdygmfqpmspnunt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326603.8666937-826-191269079479581/AnsiballZ_file.py'
Oct 01 13:50:04 compute-0 sudo[166586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:04 compute-0 python3.9[166588]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:04 compute-0 sudo[166586]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:05 compute-0 sudo[166738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qceufvatxhhdfkrusdjcrquvmzrxnwvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326604.8401716-848-260978592304439/AnsiballZ_service_facts.py'
Oct 01 13:50:05 compute-0 sudo[166738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:05 compute-0 python3.9[166740]: ansible-ansible.builtin.service_facts Invoked
Oct 01 13:50:05 compute-0 network[166757]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 01 13:50:05 compute-0 network[166758]: 'network-scripts' will be removed from distribution in near future.
Oct 01 13:50:05 compute-0 network[166759]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 01 13:50:09 compute-0 sudo[166738]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:10 compute-0 sudo[167031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uajgyhzjwdmmhruszfulcqqnsvkpsiky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326609.9401228-868-175690235048349/AnsiballZ_file.py'
Oct 01 13:50:10 compute-0 sudo[167031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:10 compute-0 python3.9[167033]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 01 13:50:10 compute-0 sudo[167031]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:11 compute-0 sudo[167183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjedhcvkfvmxynwpyecpnbuyvmqpaveb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326610.6677268-884-239675134053572/AnsiballZ_modprobe.py'
Oct 01 13:50:11 compute-0 sudo[167183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:11 compute-0 python3.9[167185]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct 01 13:50:11 compute-0 sudo[167183]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:12 compute-0 sudo[167339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyetibeuwhxaftkhcwqqxqipvyafreri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326611.6559944-900-91324679606162/AnsiballZ_stat.py'
Oct 01 13:50:12 compute-0 sudo[167339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:12 compute-0 python3.9[167341]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:50:12 compute-0 sudo[167339]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:12 compute-0 sudo[167462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoflhcezbwlsmynupkhpgtzmfafgjjcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326611.6559944-900-91324679606162/AnsiballZ_copy.py'
Oct 01 13:50:12 compute-0 sudo[167462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:13 compute-0 python3.9[167464]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326611.6559944-900-91324679606162/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:13 compute-0 sudo[167462]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:13 compute-0 systemd[1]: Stopping User Manager for UID 0...
Oct 01 13:50:13 compute-0 systemd[166251]: Activating special unit Exit the Session...
Oct 01 13:50:13 compute-0 systemd[166251]: Stopped target Main User Target.
Oct 01 13:50:13 compute-0 systemd[166251]: Stopped target Basic System.
Oct 01 13:50:13 compute-0 systemd[166251]: Stopped target Paths.
Oct 01 13:50:13 compute-0 systemd[166251]: Stopped target Sockets.
Oct 01 13:50:13 compute-0 systemd[166251]: Stopped target Timers.
Oct 01 13:50:13 compute-0 systemd[166251]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 01 13:50:13 compute-0 systemd[166251]: Closed D-Bus User Message Bus Socket.
Oct 01 13:50:13 compute-0 systemd[166251]: Stopped Create User's Volatile Files and Directories.
Oct 01 13:50:13 compute-0 systemd[166251]: Removed slice User Application Slice.
Oct 01 13:50:13 compute-0 systemd[166251]: Reached target Shutdown.
Oct 01 13:50:13 compute-0 systemd[166251]: Finished Exit the Session.
Oct 01 13:50:13 compute-0 systemd[166251]: Reached target Exit the Session.
Oct 01 13:50:13 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Oct 01 13:50:13 compute-0 systemd[1]: Stopped User Manager for UID 0.
Oct 01 13:50:13 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct 01 13:50:13 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct 01 13:50:13 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct 01 13:50:13 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct 01 13:50:13 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Oct 01 13:50:13 compute-0 sudo[167616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cudjnxtipkkiwmpnlppbsdmueenttwfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326613.3622544-932-198388379830866/AnsiballZ_lineinfile.py'
Oct 01 13:50:13 compute-0 sudo[167616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:13 compute-0 python3.9[167618]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:13 compute-0 sudo[167616]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:50:14.194 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:50:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:50:14.196 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:50:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:50:14.196 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:50:14 compute-0 sudo[167769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrmfnovuwvtwgxhzdmgohjpbjtxkoqhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326614.1406198-948-215042906019672/AnsiballZ_systemd.py'
Oct 01 13:50:14 compute-0 sudo[167769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:14 compute-0 python3.9[167771]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:50:14 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 01 13:50:14 compute-0 systemd[1]: Stopped Load Kernel Modules.
Oct 01 13:50:14 compute-0 systemd[1]: Stopping Load Kernel Modules...
Oct 01 13:50:14 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 01 13:50:14 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 01 13:50:14 compute-0 sudo[167769]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:15 compute-0 sudo[167925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbmncttkqjgzsjbcqyszwwqfhqayelwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326615.2002034-964-24168166483779/AnsiballZ_file.py'
Oct 01 13:50:15 compute-0 sudo[167925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:15 compute-0 python3.9[167927]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:50:15 compute-0 sudo[167925]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:16 compute-0 sudo[168077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyacxjbcaaqvonrvfybpetlosxrcyeqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326616.1203644-982-147417861014900/AnsiballZ_stat.py'
Oct 01 13:50:16 compute-0 sudo[168077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:16 compute-0 python3.9[168079]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:50:16 compute-0 sudo[168077]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:17 compute-0 sudo[168229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mibsnvhvtcuhiuvwfuclfpvhxzfyfedw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326617.172888-1000-120169038758350/AnsiballZ_stat.py'
Oct 01 13:50:17 compute-0 sudo[168229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:17 compute-0 python3.9[168231]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:50:17 compute-0 sudo[168229]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:18 compute-0 sudo[168412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngbpnrtbhkkslzcuihzwdlcripfgzkbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326617.9859266-1016-121217552108849/AnsiballZ_stat.py'
Oct 01 13:50:18 compute-0 sudo[168412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:18 compute-0 podman[168355]: 2025-10-01 13:50:18.421813046 +0000 UTC m=+0.113928444 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Oct 01 13:50:18 compute-0 podman[168356]: 2025-10-01 13:50:18.453029521 +0000 UTC m=+0.140869366 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, container_name=ovn_controller, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 01 13:50:18 compute-0 python3.9[168422]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:50:18 compute-0 sudo[168412]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:19 compute-0 sudo[168548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgdilawhjxdzkgwvfuclnqsgtjmgnhyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326617.9859266-1016-121217552108849/AnsiballZ_copy.py'
Oct 01 13:50:19 compute-0 sudo[168548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:19 compute-0 python3.9[168550]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326617.9859266-1016-121217552108849/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:19 compute-0 sudo[168548]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:20 compute-0 sudo[168700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trginmiwlwselmaisesjhnhcgultoqok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326619.6956298-1046-157021742029271/AnsiballZ_command.py'
Oct 01 13:50:20 compute-0 sudo[168700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:20 compute-0 python3.9[168702]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:50:20 compute-0 sudo[168700]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:21 compute-0 sudo[168853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytlxduxfqeshuetuggghdftqcsrppqty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326620.704445-1062-160738539601041/AnsiballZ_lineinfile.py'
Oct 01 13:50:21 compute-0 sudo[168853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:21 compute-0 python3.9[168855]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:21 compute-0 sudo[168853]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:22 compute-0 sudo[169005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kovlelnklhyrfdzoefqgtyfwoowrbppq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326621.5127604-1078-88513840239885/AnsiballZ_replace.py'
Oct 01 13:50:22 compute-0 sudo[169005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:22 compute-0 python3.9[169007]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:22 compute-0 sudo[169005]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:22 compute-0 sudo[169157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhcorjcufxpmmyenabzytzjxcdeeyhzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326622.5366702-1094-219813411131280/AnsiballZ_replace.py'
Oct 01 13:50:22 compute-0 sudo[169157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:23 compute-0 python3.9[169159]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:23 compute-0 sudo[169157]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:23 compute-0 sudo[169309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxbqbjsuenvhhaathxvbqsxnzrhywnru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326623.4216485-1112-174963883636077/AnsiballZ_lineinfile.py'
Oct 01 13:50:23 compute-0 sudo[169309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:23 compute-0 python3.9[169311]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:23 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct 01 13:50:23 compute-0 sudo[169309]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:24 compute-0 sudo[169462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjcaxcpxddnsrkpfcwiyjsutmkonayrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326624.1925917-1112-153193931245940/AnsiballZ_lineinfile.py'
Oct 01 13:50:24 compute-0 sudo[169462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:24 compute-0 python3.9[169464]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:24 compute-0 sudo[169462]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:25 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 01 13:50:25 compute-0 sudo[169615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfebgaucxvcsbuitbxgdilhahpsjgabz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326625.1118116-1112-164898061739587/AnsiballZ_lineinfile.py'
Oct 01 13:50:25 compute-0 sudo[169615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:25 compute-0 python3.9[169617]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:25 compute-0 sudo[169615]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:26 compute-0 sudo[169767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znywadjmoqhidcmngdlakdmogsyuqtph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326625.9745765-1112-219242574571304/AnsiballZ_lineinfile.py'
Oct 01 13:50:26 compute-0 sudo[169767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:26 compute-0 python3.9[169769]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:26 compute-0 sudo[169767]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:26 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Oct 01 13:50:27 compute-0 sudo[169920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukhpiinqzdngkslcwgyzrrdfufygmxgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326626.8026657-1170-252551101104736/AnsiballZ_stat.py'
Oct 01 13:50:27 compute-0 sudo[169920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:27 compute-0 python3.9[169922]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:50:27 compute-0 sudo[169920]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:27 compute-0 sudo[170074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngiqaetlgzdykhycvrxsxtlaoxqflyzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326627.5922883-1186-84322767547442/AnsiballZ_file.py'
Oct 01 13:50:27 compute-0 sudo[170074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:27 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct 01 13:50:28 compute-0 python3.9[170076]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:28 compute-0 sudo[170074]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:28 compute-0 sudo[170227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eblkzvifzqdztwashkgysosgvpjwqqng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326628.4643683-1204-168159511230345/AnsiballZ_file.py'
Oct 01 13:50:28 compute-0 sudo[170227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:29 compute-0 python3.9[170229]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:50:29 compute-0 sudo[170227]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:29 compute-0 sudo[170379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqaayeaqhtxhurdevdfjmcalshzoaqxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326629.2865937-1220-260272405879014/AnsiballZ_stat.py'
Oct 01 13:50:29 compute-0 sudo[170379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:29 compute-0 python3.9[170381]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:50:29 compute-0 sudo[170379]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:30 compute-0 sudo[170457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-holaukxchpeuaurknfxxvvwnmlcsvaid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326629.2865937-1220-260272405879014/AnsiballZ_file.py'
Oct 01 13:50:30 compute-0 sudo[170457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:30 compute-0 python3.9[170459]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:50:30 compute-0 sudo[170457]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:30 compute-0 sudo[170609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwueyrrqlppgemezrpufyjvavrkizsla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326630.6124012-1220-230127916706612/AnsiballZ_stat.py'
Oct 01 13:50:30 compute-0 sudo[170609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:31 compute-0 python3.9[170611]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:50:31 compute-0 sudo[170609]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:31 compute-0 sudo[170687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdogugyeeuidvpcxoqowxamfqrlprxle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326630.6124012-1220-230127916706612/AnsiballZ_file.py'
Oct 01 13:50:31 compute-0 sudo[170687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:31 compute-0 python3.9[170689]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:50:31 compute-0 sudo[170687]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:32 compute-0 sudo[170839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psibrgvwuzuqjnszkbmbokuvctuqqbfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326632.0243602-1266-66352630342634/AnsiballZ_file.py'
Oct 01 13:50:32 compute-0 sudo[170839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:32 compute-0 python3.9[170841]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:32 compute-0 sudo[170839]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:33 compute-0 podman[170943]: 2025-10-01 13:50:33.19285387 +0000 UTC m=+0.096463792 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, io.buildah.version=1.41.4)
Oct 01 13:50:33 compute-0 sudo[171011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hilozffzevowdyddqgciaxzitnsfvpos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326632.8883696-1282-101844611908690/AnsiballZ_stat.py'
Oct 01 13:50:33 compute-0 sudo[171011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:33 compute-0 python3.9[171013]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:50:33 compute-0 sudo[171011]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:33 compute-0 sudo[171089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwdwehrmeklhcivnbxuavazosjbugkyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326632.8883696-1282-101844611908690/AnsiballZ_file.py'
Oct 01 13:50:33 compute-0 sudo[171089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:34 compute-0 python3.9[171091]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:34 compute-0 sudo[171089]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:34 compute-0 sudo[171241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kabnhwqllouqjutyvlilzelostjqyzym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326634.2114995-1306-235067855418202/AnsiballZ_stat.py'
Oct 01 13:50:34 compute-0 sudo[171241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:34 compute-0 python3.9[171243]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:50:34 compute-0 sudo[171241]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:35 compute-0 sudo[171319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaosugrytoislgtnvkzbfqetgvpmzyhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326634.2114995-1306-235067855418202/AnsiballZ_file.py'
Oct 01 13:50:35 compute-0 sudo[171319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:35 compute-0 python3.9[171321]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:35 compute-0 sudo[171319]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:36 compute-0 sudo[171471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejkgwumoorlnwfstgpnoiymmuseovwpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326635.7698047-1330-111745094651620/AnsiballZ_systemd.py'
Oct 01 13:50:36 compute-0 sudo[171471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:36 compute-0 python3.9[171473]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:50:36 compute-0 systemd[1]: Reloading.
Oct 01 13:50:36 compute-0 systemd-rc-local-generator[171499]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:50:36 compute-0 systemd-sysv-generator[171506]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:50:36 compute-0 sudo[171471]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:37 compute-0 sudo[171661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsffnpjmeolgjfrbeciaupjbceeuxvlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326637.1769068-1346-139168176160155/AnsiballZ_stat.py'
Oct 01 13:50:37 compute-0 sudo[171661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:37 compute-0 python3.9[171663]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:50:37 compute-0 sudo[171661]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:38 compute-0 sudo[171739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cimujjiwfgxatioqkxwlabeqtnurtwua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326637.1769068-1346-139168176160155/AnsiballZ_file.py'
Oct 01 13:50:38 compute-0 sudo[171739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:38 compute-0 python3.9[171741]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:38 compute-0 sudo[171739]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:39 compute-0 sudo[171891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njopbjracrshlzmpfichkrhznkptxlvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326638.6607256-1370-49741087452553/AnsiballZ_stat.py'
Oct 01 13:50:39 compute-0 sudo[171891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:39 compute-0 python3.9[171893]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:50:39 compute-0 sudo[171891]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:39 compute-0 sudo[171969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boozjzjhmdylubtdleponmfxghdixfps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326638.6607256-1370-49741087452553/AnsiballZ_file.py'
Oct 01 13:50:39 compute-0 sudo[171969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:39 compute-0 python3.9[171971]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:39 compute-0 sudo[171969]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:40 compute-0 sudo[172121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmpsgdxlsoqjlqyxodnldraoazffxpxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326640.2795556-1394-97491857790422/AnsiballZ_systemd.py'
Oct 01 13:50:40 compute-0 sudo[172121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:40 compute-0 python3.9[172123]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:50:40 compute-0 systemd[1]: Reloading.
Oct 01 13:50:41 compute-0 systemd-rc-local-generator[172151]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:50:41 compute-0 systemd-sysv-generator[172155]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:50:41 compute-0 systemd[1]: Starting Create netns directory...
Oct 01 13:50:41 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct 01 13:50:41 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct 01 13:50:41 compute-0 systemd[1]: Finished Create netns directory.
Oct 01 13:50:41 compute-0 sudo[172121]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:42 compute-0 sudo[172315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqejvmozbpnbyhvbytcvvnnwhmwubjds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326641.8126304-1414-123914473726940/AnsiballZ_file.py'
Oct 01 13:50:42 compute-0 sudo[172315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:42 compute-0 python3.9[172317]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:50:42 compute-0 sudo[172315]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:43 compute-0 sudo[172467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwhrjgodnhepvxfzxxbcucezqclyaspt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326642.6800113-1430-158246530531729/AnsiballZ_stat.py'
Oct 01 13:50:43 compute-0 sudo[172467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:43 compute-0 python3.9[172469]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:50:43 compute-0 sudo[172467]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:43 compute-0 sudo[172590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfagtkipofwyecgvslkdorhbbttriuyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326642.6800113-1430-158246530531729/AnsiballZ_copy.py'
Oct 01 13:50:43 compute-0 sudo[172590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:44 compute-0 python3.9[172592]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326642.6800113-1430-158246530531729/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:50:44 compute-0 sudo[172590]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:44 compute-0 sudo[172742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muoycwfxtcbgcgcbriswygmfbsbivzjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326644.6129856-1464-239283525492391/AnsiballZ_file.py'
Oct 01 13:50:44 compute-0 sudo[172742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:45 compute-0 python3.9[172744]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:50:45 compute-0 sudo[172742]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:45 compute-0 sudo[172894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blcklnqopywzibvyqdezpyxwiwyoubge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326645.5908768-1480-189552914776758/AnsiballZ_stat.py'
Oct 01 13:50:45 compute-0 sudo[172894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:46 compute-0 python3.9[172896]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:50:46 compute-0 sudo[172894]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:46 compute-0 sudo[173017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spjjaizmjnloeysejfogudtlkbdybjcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326645.5908768-1480-189552914776758/AnsiballZ_copy.py'
Oct 01 13:50:46 compute-0 sudo[173017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:46 compute-0 python3.9[173019]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326645.5908768-1480-189552914776758/.source.json _original_basename=.lip8aijg follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:46 compute-0 sudo[173017]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:47 compute-0 sudo[173169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-degffajctlqqncagnnwvqtxsmqhvlyfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326646.9969766-1510-51070936711182/AnsiballZ_file.py'
Oct 01 13:50:47 compute-0 sudo[173169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:47 compute-0 python3.9[173171]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:47 compute-0 sudo[173169]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:48 compute-0 sudo[173321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwpahlpomjeantgqizmtnejlsvcyqcit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326647.7797475-1526-241406336582405/AnsiballZ_stat.py'
Oct 01 13:50:48 compute-0 sudo[173321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:48 compute-0 sudo[173321]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:48 compute-0 sudo[173468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nawlqcdxllihpofsgjmvmuctgrricydp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326647.7797475-1526-241406336582405/AnsiballZ_copy.py'
Oct 01 13:50:48 compute-0 sudo[173468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:49 compute-0 podman[173418]: 2025-10-01 13:50:49.003908905 +0000 UTC m=+0.128314994 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 01 13:50:49 compute-0 podman[173419]: 2025-10-01 13:50:49.051822202 +0000 UTC m=+0.175562754 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Oct 01 13:50:49 compute-0 sudo[173468]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:49 compute-0 sudo[173639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euepfjcrdvidetttughymvhzuznvfidf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326649.5687983-1560-25301854763616/AnsiballZ_container_config_data.py'
Oct 01 13:50:49 compute-0 sudo[173639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:50 compute-0 python3.9[173641]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct 01 13:50:50 compute-0 sudo[173639]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:50 compute-0 sudo[173791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aogslhyfmgueliqradvnnfdhuwdmrnmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326650.558172-1578-15712063553765/AnsiballZ_container_config_hash.py'
Oct 01 13:50:50 compute-0 sudo[173791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:51 compute-0 python3.9[173793]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 01 13:50:51 compute-0 sudo[173791]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:51 compute-0 sudo[173943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asftvsgrnjhyublfketdygqdbnwyzxgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326651.5749137-1596-139191763255052/AnsiballZ_podman_container_info.py'
Oct 01 13:50:51 compute-0 sudo[173943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:52 compute-0 python3.9[173945]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct 01 13:50:52 compute-0 sudo[173943]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:53 compute-0 sudo[174122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgnjzfbquzbtibihgvbgjserhmvnhzwe ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759326653.1567686-1622-109214375607116/AnsiballZ_edpm_container_manage.py'
Oct 01 13:50:53 compute-0 sudo[174122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:53 compute-0 python3[174124]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct 01 13:50:54 compute-0 podman[174162]: 2025-10-01 13:50:54.05936586 +0000 UTC m=+0.067624949 container create d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 01 13:50:54 compute-0 podman[174162]: 2025-10-01 13:50:54.021073288 +0000 UTC m=+0.029332367 image pull cb9980503d2e559b80f837e5c1ae5a83d16cee2e99b876ecd89624c1b09d1eaa 38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Oct 01 13:50:54 compute-0 python3[174124]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z 38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Oct 01 13:50:54 compute-0 sudo[174122]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:55 compute-0 sudo[174350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rolaybsdmqpontasldmtunnogdbgvgps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326654.7406168-1638-158287251436205/AnsiballZ_stat.py'
Oct 01 13:50:55 compute-0 sudo[174350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:55 compute-0 python3.9[174352]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:50:55 compute-0 sudo[174350]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:56 compute-0 sudo[174504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucuybgkbzgsksmdmowpgstqlprmedxfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326655.6750362-1656-94443477952614/AnsiballZ_file.py'
Oct 01 13:50:56 compute-0 sudo[174504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:56 compute-0 python3.9[174506]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:56 compute-0 sudo[174504]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:56 compute-0 sudo[174580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrtdaqkerqoyrwalednkxagawiwsostu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326655.6750362-1656-94443477952614/AnsiballZ_stat.py'
Oct 01 13:50:56 compute-0 sudo[174580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:56 compute-0 python3.9[174582]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:50:56 compute-0 sudo[174580]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:57 compute-0 sudo[174731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkgkksqhkdpdkxiltphqywthfvzvhjpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326656.870476-1656-194098493733154/AnsiballZ_copy.py'
Oct 01 13:50:57 compute-0 sudo[174731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:57 compute-0 python3.9[174733]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759326656.870476-1656-194098493733154/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:50:57 compute-0 sudo[174731]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:58 compute-0 sudo[174807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykexzpkuzcuappysagmqwpmcmrdrqghy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326656.870476-1656-194098493733154/AnsiballZ_systemd.py'
Oct 01 13:50:58 compute-0 sudo[174807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:58 compute-0 python3.9[174809]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:50:58 compute-0 systemd[1]: Reloading.
Oct 01 13:50:58 compute-0 systemd-rc-local-generator[174837]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:50:58 compute-0 systemd-sysv-generator[174841]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:50:58 compute-0 sudo[174807]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:58 compute-0 sudo[174918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxzivmwxnpftpwrmyohnrohcrbicwqov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326656.870476-1656-194098493733154/AnsiballZ_systemd.py'
Oct 01 13:50:58 compute-0 sudo[174918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:50:59 compute-0 python3.9[174920]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:50:59 compute-0 systemd[1]: Reloading.
Oct 01 13:50:59 compute-0 systemd-rc-local-generator[174951]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:50:59 compute-0 systemd-sysv-generator[174955]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:50:59 compute-0 systemd[1]: Starting multipathd container...
Oct 01 13:50:59 compute-0 systemd[1]: Started libcrun container.
Oct 01 13:50:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599eebfeb45a720a5f093c0ae5a4bd784b963e7d1275f1d8cf127b7ac6da34e7/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 01 13:50:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599eebfeb45a720a5f093c0ae5a4bd784b963e7d1275f1d8cf127b7ac6da34e7/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 01 13:50:59 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0.
Oct 01 13:50:59 compute-0 podman[174961]: 2025-10-01 13:50:59.863596276 +0000 UTC m=+0.179632190 container init d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Oct 01 13:50:59 compute-0 multipathd[174977]: + sudo -E kolla_set_configs
Oct 01 13:50:59 compute-0 podman[174961]: 2025-10-01 13:50:59.894125184 +0000 UTC m=+0.210161208 container start d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Oct 01 13:50:59 compute-0 podman[174961]: multipathd
Oct 01 13:50:59 compute-0 sudo[174983]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 01 13:50:59 compute-0 sudo[174983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 01 13:50:59 compute-0 systemd[1]: Started multipathd container.
Oct 01 13:50:59 compute-0 multipathd[174977]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 01 13:50:59 compute-0 multipathd[174977]: INFO:__main__:Validating config file
Oct 01 13:50:59 compute-0 multipathd[174977]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 01 13:50:59 compute-0 multipathd[174977]: INFO:__main__:Writing out command to execute
Oct 01 13:50:59 compute-0 sudo[174918]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:59 compute-0 sudo[174983]: pam_unix(sudo:session): session closed for user root
Oct 01 13:50:59 compute-0 multipathd[174977]: ++ cat /run_command
Oct 01 13:51:00 compute-0 multipathd[174977]: + CMD='/usr/sbin/multipathd -d'
Oct 01 13:51:00 compute-0 multipathd[174977]: + ARGS=
Oct 01 13:51:00 compute-0 multipathd[174977]: + sudo kolla_copy_cacerts
Oct 01 13:51:00 compute-0 sudo[174997]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 01 13:51:00 compute-0 sudo[174997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 01 13:51:00 compute-0 sudo[174997]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:00 compute-0 multipathd[174977]: + [[ ! -n '' ]]
Oct 01 13:51:00 compute-0 multipathd[174977]: + . kolla_extend_start
Oct 01 13:51:00 compute-0 multipathd[174977]: Running command: '/usr/sbin/multipathd -d'
Oct 01 13:51:00 compute-0 multipathd[174977]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 01 13:51:00 compute-0 multipathd[174977]: + umask 0022
Oct 01 13:51:00 compute-0 multipathd[174977]: + exec /usr/sbin/multipathd -d
Oct 01 13:51:00 compute-0 podman[174984]: 2025-10-01 13:51:00.046587325 +0000 UTC m=+0.133678076 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 01 13:51:00 compute-0 multipathd[174977]: 3020.644620 | multipathd v0.9.9: start up
Oct 01 13:51:00 compute-0 systemd[1]: d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0-1e2840138a80c49b.service: Main process exited, code=exited, status=1/FAILURE
Oct 01 13:51:00 compute-0 systemd[1]: d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0-1e2840138a80c49b.service: Failed with result 'exit-code'.
Oct 01 13:51:00 compute-0 multipathd[174977]: 3020.656340 | reconfigure: setting up paths and maps
Oct 01 13:51:00 compute-0 multipathd[174977]: 3020.662085 | _check_bindings_file: failed to read header from /etc/multipath/bindings
Oct 01 13:51:00 compute-0 multipathd[174977]: 3020.665845 | updated bindings file /etc/multipath/bindings
Oct 01 13:51:00 compute-0 python3.9[175166]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:51:01 compute-0 sudo[175318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcfrgrmdgfunfnstdclvymhtklxwwsir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326660.9691737-1728-57209988872561/AnsiballZ_command.py'
Oct 01 13:51:01 compute-0 sudo[175318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:01 compute-0 python3.9[175320]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:51:01 compute-0 sudo[175318]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:02 compute-0 sudo[175483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxhxpuhyyowlybbemxmzokbeybrbgcof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326661.8623118-1744-13873343331773/AnsiballZ_systemd.py'
Oct 01 13:51:02 compute-0 sudo[175483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:02 compute-0 python3.9[175485]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:51:02 compute-0 systemd[1]: Stopping multipathd container...
Oct 01 13:51:02 compute-0 multipathd[174977]: 3023.318511 | multipathd: shut down
Oct 01 13:51:02 compute-0 systemd[1]: libpod-d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0.scope: Deactivated successfully.
Oct 01 13:51:02 compute-0 podman[175489]: 2025-10-01 13:51:02.754895768 +0000 UTC m=+0.073045703 container died d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 01 13:51:02 compute-0 systemd[1]: d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0-1e2840138a80c49b.timer: Deactivated successfully.
Oct 01 13:51:02 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0.
Oct 01 13:51:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-599eebfeb45a720a5f093c0ae5a4bd784b963e7d1275f1d8cf127b7ac6da34e7-merged.mount: Deactivated successfully.
Oct 01 13:51:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0-userdata-shm.mount: Deactivated successfully.
Oct 01 13:51:02 compute-0 podman[175489]: 2025-10-01 13:51:02.811149615 +0000 UTC m=+0.129299520 container cleanup d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 01 13:51:02 compute-0 podman[175489]: multipathd
Oct 01 13:51:02 compute-0 podman[175516]: multipathd
Oct 01 13:51:02 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct 01 13:51:02 compute-0 systemd[1]: Stopped multipathd container.
Oct 01 13:51:02 compute-0 systemd[1]: Starting multipathd container...
Oct 01 13:51:03 compute-0 systemd[1]: Started libcrun container.
Oct 01 13:51:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599eebfeb45a720a5f093c0ae5a4bd784b963e7d1275f1d8cf127b7ac6da34e7/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 01 13:51:03 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/599eebfeb45a720a5f093c0ae5a4bd784b963e7d1275f1d8cf127b7ac6da34e7/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 01 13:51:03 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0.
Oct 01 13:51:03 compute-0 podman[175529]: 2025-10-01 13:51:03.100614509 +0000 UTC m=+0.154568018 container init d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 01 13:51:03 compute-0 multipathd[175545]: + sudo -E kolla_set_configs
Oct 01 13:51:03 compute-0 sudo[175551]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Oct 01 13:51:03 compute-0 sudo[175551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 01 13:51:03 compute-0 podman[175529]: 2025-10-01 13:51:03.143610256 +0000 UTC m=+0.197563725 container start d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Oct 01 13:51:03 compute-0 podman[175529]: multipathd
Oct 01 13:51:03 compute-0 systemd[1]: Started multipathd container.
Oct 01 13:51:03 compute-0 multipathd[175545]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 01 13:51:03 compute-0 multipathd[175545]: INFO:__main__:Validating config file
Oct 01 13:51:03 compute-0 multipathd[175545]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 01 13:51:03 compute-0 multipathd[175545]: INFO:__main__:Writing out command to execute
Oct 01 13:51:03 compute-0 sudo[175483]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:03 compute-0 sudo[175551]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:03 compute-0 multipathd[175545]: ++ cat /run_command
Oct 01 13:51:03 compute-0 multipathd[175545]: + CMD='/usr/sbin/multipathd -d'
Oct 01 13:51:03 compute-0 multipathd[175545]: + ARGS=
Oct 01 13:51:03 compute-0 multipathd[175545]: + sudo kolla_copy_cacerts
Oct 01 13:51:03 compute-0 sudo[175571]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Oct 01 13:51:03 compute-0 sudo[175571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Oct 01 13:51:03 compute-0 sudo[175571]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:03 compute-0 multipathd[175545]: Running command: '/usr/sbin/multipathd -d'
Oct 01 13:51:03 compute-0 multipathd[175545]: + [[ ! -n '' ]]
Oct 01 13:51:03 compute-0 multipathd[175545]: + . kolla_extend_start
Oct 01 13:51:03 compute-0 multipathd[175545]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct 01 13:51:03 compute-0 multipathd[175545]: + umask 0022
Oct 01 13:51:03 compute-0 multipathd[175545]: + exec /usr/sbin/multipathd -d
Oct 01 13:51:03 compute-0 podman[175552]: 2025-10-01 13:51:03.24434926 +0000 UTC m=+0.076478203 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:51:03 compute-0 multipathd[175545]: 3023.840205 | multipathd v0.9.9: start up
Oct 01 13:51:03 compute-0 multipathd[175545]: 3023.846153 | reconfigure: setting up paths and maps
Oct 01 13:51:03 compute-0 systemd[1]: d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0-1f96698f78231c16.service: Main process exited, code=exited, status=1/FAILURE
Oct 01 13:51:03 compute-0 systemd[1]: d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0-1f96698f78231c16.service: Failed with result 'exit-code'.
Oct 01 13:51:03 compute-0 podman[175594]: 2025-10-01 13:51:03.348106254 +0000 UTC m=+0.068673057 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 13:51:03 compute-0 sudo[175754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eaacphzyzvnkytpzcjdptdwwpselilxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326663.4220138-1760-147204402034261/AnsiballZ_file.py'
Oct 01 13:51:03 compute-0 sudo[175754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:03 compute-0 python3.9[175756]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:04 compute-0 sudo[175754]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:04 compute-0 sudo[175906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nikrtqjptbjsqepeywqngxhxkqybfpwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326664.4421232-1784-100694387376359/AnsiballZ_file.py'
Oct 01 13:51:04 compute-0 sudo[175906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:05 compute-0 python3.9[175908]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct 01 13:51:05 compute-0 sudo[175906]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:05 compute-0 sudo[176058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfeqcxzvhwgztrzobizqotakjhhxsowl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326665.3337522-1800-224020047030977/AnsiballZ_modprobe.py'
Oct 01 13:51:05 compute-0 sudo[176058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:05 compute-0 python3.9[176060]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct 01 13:51:05 compute-0 kernel: Key type psk registered
Oct 01 13:51:06 compute-0 sudo[176058]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:06 compute-0 sudo[176219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwahbzjawjjqlvygjhvwyvrzbdhqxiun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326666.2927048-1816-236646686241861/AnsiballZ_stat.py'
Oct 01 13:51:06 compute-0 sudo[176219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:06 compute-0 python3.9[176221]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:51:06 compute-0 sudo[176219]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:07 compute-0 sudo[176342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpitoemoprtnmaxwdjfnbfovepmrbnyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326666.2927048-1816-236646686241861/AnsiballZ_copy.py'
Oct 01 13:51:07 compute-0 sudo[176342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:07 compute-0 python3.9[176344]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326666.2927048-1816-236646686241861/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:07 compute-0 sudo[176342]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:08 compute-0 sudo[176494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipzcqqktxdbupkcprdywlqgfmsoawmtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326667.9768195-1848-257180779384078/AnsiballZ_lineinfile.py'
Oct 01 13:51:08 compute-0 sudo[176494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:08 compute-0 python3.9[176496]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:08 compute-0 sudo[176494]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:09 compute-0 sudo[176646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pismqoybdzubcyiirugssqjnuregbzdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326668.8428032-1864-111319880180947/AnsiballZ_systemd.py'
Oct 01 13:51:09 compute-0 sudo[176646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:09 compute-0 python3.9[176648]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:51:09 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct 01 13:51:09 compute-0 systemd[1]: Stopped Load Kernel Modules.
Oct 01 13:51:09 compute-0 systemd[1]: Stopping Load Kernel Modules...
Oct 01 13:51:09 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct 01 13:51:09 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct 01 13:51:09 compute-0 sudo[176646]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:10 compute-0 sudo[176802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbnuyjjnygjpyrqwgcwxybojmawmzond ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326670.0241852-1880-34363414163865/AnsiballZ_setup.py'
Oct 01 13:51:10 compute-0 sudo[176802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:10 compute-0 python3.9[176804]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct 01 13:51:11 compute-0 sudo[176802]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:11 compute-0 sudo[176886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfillthgcjkmgbcqmycoaicmhggnxfhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326670.0241852-1880-34363414163865/AnsiballZ_dnf.py'
Oct 01 13:51:11 compute-0 sudo[176886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:11 compute-0 python3.9[176888]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 01 13:51:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:51:14.198 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:51:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:51:14.198 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:51:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:51:14.198 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:51:17 compute-0 systemd[1]: Reloading.
Oct 01 13:51:18 compute-0 systemd-rc-local-generator[176913]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:51:18 compute-0 systemd-sysv-generator[176920]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:51:18 compute-0 systemd[1]: Reloading.
Oct 01 13:51:18 compute-0 systemd-rc-local-generator[176957]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:51:18 compute-0 systemd-sysv-generator[176960]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:51:18 compute-0 systemd-logind[791]: Watching system buttons on /dev/input/event0 (Power Button)
Oct 01 13:51:18 compute-0 systemd-logind[791]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct 01 13:51:18 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct 01 13:51:18 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct 01 13:51:18 compute-0 systemd[1]: Reloading.
Oct 01 13:51:19 compute-0 systemd-sysv-generator[177057]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:51:19 compute-0 systemd-rc-local-generator[177054]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:51:19 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct 01 13:51:19 compute-0 podman[177079]: 2025-10-01 13:51:19.431622385 +0000 UTC m=+0.114690823 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:51:19 compute-0 podman[177088]: 2025-10-01 13:51:19.468496392 +0000 UTC m=+0.143888280 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 01 13:51:20 compute-0 sudo[176886]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:20 compute-0 sudo[178166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwogtflmssenoatocovswfrgkcopzhez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326680.2415514-1904-156509499337294/AnsiballZ_file.py'
Oct 01 13:51:20 compute-0 sudo[178166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:20 compute-0 python3.9[178191]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:20 compute-0 sudo[178166]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:21 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct 01 13:51:21 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct 01 13:51:21 compute-0 systemd[1]: man-db-cache-update.service: Consumed 2.569s CPU time.
Oct 01 13:51:21 compute-0 systemd[1]: run-r0413e85eec554575b4c6ba5b13dd3fad.service: Deactivated successfully.
Oct 01 13:51:21 compute-0 python3.9[178539]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:51:22 compute-0 sudo[178693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awlmpzqodshefynlphhvxxqodixtqiox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326682.358564-1939-125205394840132/AnsiballZ_file.py'
Oct 01 13:51:22 compute-0 sudo[178693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:22 compute-0 python3.9[178695]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:22 compute-0 sudo[178693]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:24 compute-0 sudo[178845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxwexlpsvvpwzvhlyepptkdgieqygycc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326683.3866715-1961-13741107808693/AnsiballZ_systemd_service.py'
Oct 01 13:51:24 compute-0 sudo[178845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:24 compute-0 python3.9[178847]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:51:24 compute-0 systemd[1]: Reloading.
Oct 01 13:51:24 compute-0 systemd-rc-local-generator[178875]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:51:24 compute-0 systemd-sysv-generator[178878]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:51:24 compute-0 sudo[178845]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:25 compute-0 python3.9[179032]: ansible-ansible.builtin.service_facts Invoked
Oct 01 13:51:25 compute-0 network[179049]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 01 13:51:25 compute-0 network[179050]: 'network-scripts' will be removed from distribution in near future.
Oct 01 13:51:25 compute-0 network[179051]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 01 13:51:32 compute-0 sudo[179326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzokhafsfanichnqxwxksabjswxpjejd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326692.317524-1999-194477041518706/AnsiballZ_systemd_service.py'
Oct 01 13:51:32 compute-0 sudo[179326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:33 compute-0 python3.9[179328]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:51:33 compute-0 sudo[179326]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:33 compute-0 sudo[179509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjhbtmkvujvuufeuhuizfnkhzrjsocmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326693.2643495-1999-140908682273567/AnsiballZ_systemd_service.py'
Oct 01 13:51:33 compute-0 sudo[179509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:33 compute-0 podman[179454]: 2025-10-01 13:51:33.655612435 +0000 UTC m=+0.088920290 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 01 13:51:33 compute-0 podman[179453]: 2025-10-01 13:51:33.683715783 +0000 UTC m=+0.123195896 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid)
Oct 01 13:51:33 compute-0 python3.9[179519]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:51:34 compute-0 sudo[179509]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:34 compute-0 sudo[179670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwohpwqgmvxzneeykuziqwkcnhwwwidd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326694.1743448-1999-170914677479917/AnsiballZ_systemd_service.py'
Oct 01 13:51:34 compute-0 sudo[179670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:34 compute-0 python3.9[179672]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:51:34 compute-0 sudo[179670]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:35 compute-0 sudo[179823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sblevltvgitnljgjynvqstmoejhzguys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326695.0588825-1999-4000098854230/AnsiballZ_systemd_service.py'
Oct 01 13:51:35 compute-0 sudo[179823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:35 compute-0 python3.9[179825]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:51:35 compute-0 sudo[179823]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:36 compute-0 sudo[179976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpmmjjmkfsxkkuwslbwsxhhqiivtperf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326695.9399338-1999-128335796234455/AnsiballZ_systemd_service.py'
Oct 01 13:51:36 compute-0 sudo[179976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:36 compute-0 python3.9[179978]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:51:36 compute-0 sudo[179976]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:37 compute-0 sudo[180129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qokobnxgbmczkgflvqoimtwnlkyzfghr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326696.926888-1999-176989338892504/AnsiballZ_systemd_service.py'
Oct 01 13:51:37 compute-0 sudo[180129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:37 compute-0 python3.9[180131]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:51:37 compute-0 sudo[180129]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:38 compute-0 sudo[180282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sssgexxedurvgnjcmqyuctodmvngpxux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326697.8049767-1999-213288178243484/AnsiballZ_systemd_service.py'
Oct 01 13:51:38 compute-0 sudo[180282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:38 compute-0 python3.9[180284]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:51:38 compute-0 sudo[180282]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:39 compute-0 sudo[180435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luxpmzxwbfthbsvjeioblgyepmphiyws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326698.7388546-1999-253196881356777/AnsiballZ_systemd_service.py'
Oct 01 13:51:39 compute-0 sudo[180435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:39 compute-0 python3.9[180437]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:51:39 compute-0 sudo[180435]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:40 compute-0 sudo[180588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylenzigzrdkuqwbyxmbiodoyjmvafiog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326699.7923663-2117-242538370275418/AnsiballZ_file.py'
Oct 01 13:51:40 compute-0 sudo[180588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:40 compute-0 python3.9[180590]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:40 compute-0 sudo[180588]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:40 compute-0 sudo[180740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aolconidudfviclpqdanqwamwzchkyws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326700.5319564-2117-21282776268512/AnsiballZ_file.py'
Oct 01 13:51:40 compute-0 sudo[180740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:41 compute-0 python3.9[180742]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:41 compute-0 sudo[180740]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:41 compute-0 sudo[180892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crzfasujcuvnzgwavyxcfmfijqbntwcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326701.2645566-2117-51833044121768/AnsiballZ_file.py'
Oct 01 13:51:41 compute-0 sudo[180892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:41 compute-0 python3.9[180894]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:41 compute-0 sudo[180892]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:42 compute-0 sudo[181044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnrgkqjxjipqbugwqmixyxdrjqzeanip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326701.9854872-2117-131524970451456/AnsiballZ_file.py'
Oct 01 13:51:42 compute-0 sudo[181044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:42 compute-0 python3.9[181046]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:42 compute-0 sudo[181044]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:43 compute-0 sudo[181196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eolxlrcfmmnbhcsxxblksqyxtztbbron ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326702.7968998-2117-130398555643157/AnsiballZ_file.py'
Oct 01 13:51:43 compute-0 sudo[181196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:43 compute-0 python3.9[181198]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:43 compute-0 sudo[181196]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:43 compute-0 sudo[181348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyoahkmuhqnhhmocfmveefivzsfricmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326703.5597174-2117-251833239681173/AnsiballZ_file.py'
Oct 01 13:51:43 compute-0 sudo[181348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:44 compute-0 python3.9[181350]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:44 compute-0 sudo[181348]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:44 compute-0 sudo[181500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwqvxhbkmxgfrppyyskcgtvbcaiimtbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326704.3915362-2117-18573398190603/AnsiballZ_file.py'
Oct 01 13:51:44 compute-0 sudo[181500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:45 compute-0 python3.9[181502]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:45 compute-0 sudo[181500]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:45 compute-0 sudo[181652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzqvbbikvdgvcixstwvcozfamnlyzwtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326705.2205617-2117-60971071969789/AnsiballZ_file.py'
Oct 01 13:51:45 compute-0 sudo[181652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:45 compute-0 python3.9[181654]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:45 compute-0 sudo[181652]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:46 compute-0 sudo[181804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjzwzjfgvfkdzahqxualerzukeuwsbpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326706.0242536-2231-119643273758280/AnsiballZ_file.py'
Oct 01 13:51:46 compute-0 sudo[181804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:46 compute-0 python3.9[181806]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:46 compute-0 sudo[181804]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:47 compute-0 sudo[181956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdjjuwqwxwytrkwuzuzhelzimucxapzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326706.8416555-2231-93096839772297/AnsiballZ_file.py'
Oct 01 13:51:47 compute-0 sudo[181956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:47 compute-0 python3.9[181958]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:47 compute-0 sudo[181956]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:48 compute-0 sudo[182108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwjhkpisnpzhsaonuhzfrngetgwatana ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326707.6507535-2231-242281907477218/AnsiballZ_file.py'
Oct 01 13:51:48 compute-0 sudo[182108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:48 compute-0 python3.9[182110]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:48 compute-0 sudo[182108]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:48 compute-0 sudo[182260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcpqfwmpxatlacrjdocljwmsaykgekwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326708.4481127-2231-267122519173585/AnsiballZ_file.py'
Oct 01 13:51:48 compute-0 sudo[182260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:49 compute-0 python3.9[182262]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:49 compute-0 sudo[182260]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:49 compute-0 sudo[182436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxfgoqkqktuydhvlobjioiqzndftrkjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326709.3094351-2231-10023967926899/AnsiballZ_file.py'
Oct 01 13:51:49 compute-0 sudo[182436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:49 compute-0 podman[182386]: 2025-10-01 13:51:49.776879472 +0000 UTC m=+0.123629477 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 13:51:49 compute-0 podman[182387]: 2025-10-01 13:51:49.880368989 +0000 UTC m=+0.223750002 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 01 13:51:49 compute-0 python3.9[182455]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:49 compute-0 sudo[182436]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:50 compute-0 sudo[182612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvmzbcoebljlrbulufdhonaarxybdkor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326710.1396778-2231-91581437877319/AnsiballZ_file.py'
Oct 01 13:51:50 compute-0 sudo[182612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:50 compute-0 python3.9[182614]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:50 compute-0 sudo[182612]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:51 compute-0 sudo[182764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjzzeqzlnpvnymqretgaoohkfbqqlqbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326710.898864-2231-232412100035850/AnsiballZ_file.py'
Oct 01 13:51:51 compute-0 sudo[182764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:51 compute-0 python3.9[182766]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:51 compute-0 sudo[182764]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:52 compute-0 sudo[182916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urbspytzsyerkecsehbbeskafrmzwrli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326711.613446-2231-109920687237262/AnsiballZ_file.py'
Oct 01 13:51:52 compute-0 sudo[182916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:52 compute-0 python3.9[182918]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:51:52 compute-0 sudo[182916]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:52 compute-0 sudo[183068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnwocrmildrmlwjvtytcjiuocpkjmbom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326712.5969515-2347-200745865758066/AnsiballZ_command.py'
Oct 01 13:51:52 compute-0 sudo[183068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:53 compute-0 python3.9[183070]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:51:53 compute-0 sudo[183068]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:54 compute-0 python3.9[183222]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 01 13:51:54 compute-0 sudo[183372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyhgwajyfcktkldifggfqwpnbrudrcrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326714.531759-2383-7030840859027/AnsiballZ_systemd_service.py'
Oct 01 13:51:54 compute-0 sudo[183372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:55 compute-0 python3.9[183374]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:51:55 compute-0 systemd[1]: Reloading.
Oct 01 13:51:55 compute-0 systemd-rc-local-generator[183403]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:51:55 compute-0 systemd-sysv-generator[183406]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:51:55 compute-0 sudo[183372]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:56 compute-0 sudo[183559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdhrfvibxoekwuwppudemcbflrcnuogl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326715.8496935-2399-237979954431859/AnsiballZ_command.py'
Oct 01 13:51:56 compute-0 sudo[183559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:56 compute-0 python3.9[183561]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:51:56 compute-0 sudo[183559]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:57 compute-0 sudo[183712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnaxtpkemectqaqspzqlcqjulqqweqxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326716.6711583-2399-200470301089760/AnsiballZ_command.py'
Oct 01 13:51:57 compute-0 sudo[183712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:57 compute-0 python3.9[183714]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:51:57 compute-0 sudo[183712]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:57 compute-0 sudo[183865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuhuzfhutlfvypzplkjkqkgnhghglnze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326717.4243062-2399-257880340520935/AnsiballZ_command.py'
Oct 01 13:51:57 compute-0 sudo[183865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:57 compute-0 python3.9[183867]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:51:57 compute-0 sudo[183865]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:58 compute-0 sudo[184018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggampoofrelgjzpaeeiyablwfzhewovz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326718.1678128-2399-193334207805340/AnsiballZ_command.py'
Oct 01 13:51:58 compute-0 sudo[184018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:58 compute-0 python3.9[184020]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:51:58 compute-0 sudo[184018]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:59 compute-0 sudo[184171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyhvwjmfftuodxxeibfdtgyllqzaynld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326718.956844-2399-140363972193976/AnsiballZ_command.py'
Oct 01 13:51:59 compute-0 sudo[184171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:51:59 compute-0 python3.9[184173]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:51:59 compute-0 sudo[184171]: pam_unix(sudo:session): session closed for user root
Oct 01 13:51:59 compute-0 sudo[184324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-senlyvqsorzlvjviewvmorwggdgwptpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326719.6568398-2399-263498998066670/AnsiballZ_command.py'
Oct 01 13:52:00 compute-0 sudo[184324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:00 compute-0 python3.9[184326]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:52:00 compute-0 sudo[184324]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:00 compute-0 sudo[184477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqaliyzvtsmszfkaudcxfyyrhdevevdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326720.4049997-2399-241418923744345/AnsiballZ_command.py'
Oct 01 13:52:00 compute-0 sudo[184477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:00 compute-0 python3.9[184479]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:52:01 compute-0 sudo[184477]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:01 compute-0 sudo[184630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbpincqrqwemtpsicidrilqwjooctwdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326721.1889887-2399-151698532684532/AnsiballZ_command.py'
Oct 01 13:52:01 compute-0 sudo[184630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:01 compute-0 python3.9[184632]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:52:01 compute-0 sudo[184630]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:03 compute-0 sudo[184783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okqwuczopmpbioovkjidbnbdsmtshnlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326722.6685607-2542-45872169175730/AnsiballZ_file.py'
Oct 01 13:52:03 compute-0 sudo[184783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:03 compute-0 python3.9[184785]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:03 compute-0 sudo[184783]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:03 compute-0 sudo[184947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmsqsemwgxwqbrtmykeqpkptdtgalwpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326723.425661-2542-197984451289255/AnsiballZ_file.py'
Oct 01 13:52:03 compute-0 sudo[184947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:03 compute-0 podman[184909]: 2025-10-01 13:52:03.767098387 +0000 UTC m=+0.067988618 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:52:03 compute-0 podman[184954]: 2025-10-01 13:52:03.835847975 +0000 UTC m=+0.068360738 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 01 13:52:03 compute-0 python3.9[184958]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:04 compute-0 sudo[184947]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:04 compute-0 sudo[185128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upyqsxvilbaemqgajmorfezzcaqdvcna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326724.1526995-2542-110597386695198/AnsiballZ_file.py'
Oct 01 13:52:04 compute-0 sudo[185128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:04 compute-0 python3.9[185130]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:04 compute-0 sudo[185128]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:05 compute-0 sudo[185280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nljwndxmskjqfcdnuhsoyfcmkvhuliyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326724.9219725-2586-64978599857885/AnsiballZ_file.py'
Oct 01 13:52:05 compute-0 sudo[185280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:05 compute-0 python3.9[185282]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:05 compute-0 sudo[185280]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:06 compute-0 sudo[185432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shqeylrkpzmnjgnhrfbilimpljcxihdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326725.7138214-2586-34654282867191/AnsiballZ_file.py'
Oct 01 13:52:06 compute-0 sudo[185432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:06 compute-0 python3.9[185434]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:06 compute-0 sudo[185432]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:06 compute-0 sudo[185584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhmftoqxmozqbdfbfthfqadyuungilqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326726.4166622-2586-47358492961256/AnsiballZ_file.py'
Oct 01 13:52:06 compute-0 sudo[185584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:06 compute-0 python3.9[185586]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:07 compute-0 sudo[185584]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:07 compute-0 sudo[185736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doizhsulazwyzovrfczeeppxtlktbgbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326727.1649368-2586-143565868187286/AnsiballZ_file.py'
Oct 01 13:52:07 compute-0 sudo[185736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:07 compute-0 python3.9[185738]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:07 compute-0 sudo[185736]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:08 compute-0 sudo[185888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qujhgsfdelgcverdhucjlffjazpdetfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326727.9412954-2586-132737616470140/AnsiballZ_file.py'
Oct 01 13:52:08 compute-0 sudo[185888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:08 compute-0 python3.9[185890]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:08 compute-0 sudo[185888]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:09 compute-0 sudo[186040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsnyvuuqzwssbacagollnvjyssivgvpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326728.747807-2586-18562472506405/AnsiballZ_file.py'
Oct 01 13:52:09 compute-0 sudo[186040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:09 compute-0 python3.9[186042]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:09 compute-0 sudo[186040]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:09 compute-0 sudo[186192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifafyletjykbwwdfgrakvduyknxbpffo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326729.5241141-2586-36330719932261/AnsiballZ_file.py'
Oct 01 13:52:09 compute-0 sudo[186192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:10 compute-0 python3.9[186194]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:10 compute-0 sudo[186192]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:10 compute-0 sudo[186344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyqugaeaeauekfaayckxtceashqajtln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326730.3009226-2586-230890510082403/AnsiballZ_file.py'
Oct 01 13:52:10 compute-0 sudo[186344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:10 compute-0 python3.9[186346]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:10 compute-0 sudo[186344]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:11 compute-0 sudo[186496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgpntsakunaeyggluonlqcypyizsoydi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326731.0555384-2586-139431206155659/AnsiballZ_file.py'
Oct 01 13:52:11 compute-0 sudo[186496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:11 compute-0 python3.9[186498]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:11 compute-0 sudo[186496]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:52:14.199 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:52:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:52:14.200 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:52:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:52:14.200 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:52:16 compute-0 sudo[186649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvjpqgswggxwlfkxghlxgpwsqjavioax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326736.020438-2851-154987035449831/AnsiballZ_getent.py'
Oct 01 13:52:16 compute-0 sudo[186649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:16 compute-0 python3.9[186651]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct 01 13:52:16 compute-0 sudo[186649]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:17 compute-0 sudo[186802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yopryfvvtspltijnrtrjmtagfifubgvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326737.065915-2867-259502325507314/AnsiballZ_group.py'
Oct 01 13:52:17 compute-0 sudo[186802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:17 compute-0 python3.9[186804]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 01 13:52:17 compute-0 groupadd[186805]: group added to /etc/group: name=nova, GID=42436
Oct 01 13:52:17 compute-0 groupadd[186805]: group added to /etc/gshadow: name=nova
Oct 01 13:52:17 compute-0 groupadd[186805]: new group: name=nova, GID=42436
Oct 01 13:52:17 compute-0 sudo[186802]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:18 compute-0 sudo[186960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdcfmzuxmzymvglwfrupmcdvrafffxip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326738.1415787-2883-182866132929907/AnsiballZ_user.py'
Oct 01 13:52:18 compute-0 sudo[186960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:19 compute-0 python3.9[186962]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 01 13:52:19 compute-0 useradd[186964]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Oct 01 13:52:19 compute-0 useradd[186964]: add 'nova' to group 'libvirt'
Oct 01 13:52:19 compute-0 useradd[186964]: add 'nova' to shadow group 'libvirt'
Oct 01 13:52:19 compute-0 sudo[186960]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:19 compute-0 sshd-session[186995]: Accepted publickey for zuul from 192.168.122.30 port 32882 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:52:19 compute-0 systemd-logind[791]: New session 27 of user zuul.
Oct 01 13:52:19 compute-0 systemd[1]: Started Session 27 of User zuul.
Oct 01 13:52:20 compute-0 sshd-session[186995]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:52:20 compute-0 podman[186997]: 2025-10-01 13:52:20.104471098 +0000 UTC m=+0.119342207 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 01 13:52:20 compute-0 podman[186999]: 2025-10-01 13:52:20.140959037 +0000 UTC m=+0.150841566 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 13:52:20 compute-0 sshd-session[187017]: Received disconnect from 192.168.122.30 port 32882:11: disconnected by user
Oct 01 13:52:20 compute-0 sshd-session[187017]: Disconnected from user zuul 192.168.122.30 port 32882
Oct 01 13:52:20 compute-0 sshd-session[186995]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:52:20 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Oct 01 13:52:20 compute-0 systemd-logind[791]: Session 27 logged out. Waiting for processes to exit.
Oct 01 13:52:20 compute-0 systemd-logind[791]: Removed session 27.
Oct 01 13:52:21 compute-0 python3.9[187191]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:52:21 compute-0 python3.9[187312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326740.39841-2933-123404034286130/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:22 compute-0 python3.9[187462]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:52:22 compute-0 python3.9[187538]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:23 compute-0 python3.9[187688]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:52:24 compute-0 python3.9[187809]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326743.1938643-2933-216254302761322/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:25 compute-0 python3.9[187959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:52:25 compute-0 python3.9[188080]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326744.6151178-2933-3070801239464/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:26 compute-0 python3.9[188230]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:52:27 compute-0 python3.9[188351]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326746.1151474-2933-117431760593255/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:28 compute-0 sudo[188501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deyxdzwwdgsvzswyhdeztqhmdtvkbaht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326747.7756917-3071-8422375978799/AnsiballZ_file.py'
Oct 01 13:52:28 compute-0 sudo[188501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:28 compute-0 python3.9[188503]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:52:28 compute-0 sudo[188501]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:29 compute-0 sudo[188653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unsgormtdzdcndzqidpmvjwusbfhdfpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326748.6134763-3087-210553236237979/AnsiballZ_copy.py'
Oct 01 13:52:29 compute-0 sudo[188653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:29 compute-0 python3.9[188655]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:52:29 compute-0 sudo[188653]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:29 compute-0 sudo[188805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idyrywirdbtfgwhknbazbxqxhekulbbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326749.4646626-3103-102274866193442/AnsiballZ_stat.py'
Oct 01 13:52:29 compute-0 sudo[188805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:30 compute-0 python3.9[188807]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:52:30 compute-0 sudo[188805]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:30 compute-0 sudo[188957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdldtmoylessutvmrokuanwndzcfoldf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326750.3038816-3119-127167432194101/AnsiballZ_stat.py'
Oct 01 13:52:30 compute-0 sudo[188957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:30 compute-0 python3.9[188959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:52:30 compute-0 sudo[188957]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:31 compute-0 sudo[189080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qztjtetnspbgafmtdgczanmrajdtdzwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326750.3038816-3119-127167432194101/AnsiballZ_copy.py'
Oct 01 13:52:31 compute-0 sudo[189080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:31 compute-0 python3.9[189082]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759326750.3038816-3119-127167432194101/.source _original_basename=.nxiyji3m follow=False checksum=d858193b8ad620b445fc81ad89066c3f89b0fbcf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct 01 13:52:31 compute-0 sudo[189080]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:32 compute-0 python3.9[189234]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:52:33 compute-0 python3.9[189386]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:52:34 compute-0 python3.9[189507]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326752.7565448-3171-64148423844795/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=52e5c207b65a05937a65caa1823d79c347a7beb0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:34 compute-0 podman[189509]: 2025-10-01 13:52:34.153915877 +0000 UTC m=+0.083361150 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 01 13:52:34 compute-0 podman[189508]: 2025-10-01 13:52:34.16092032 +0000 UTC m=+0.091854122 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:52:34 compute-0 python3.9[189698]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:52:35 compute-0 python3.9[189819]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326754.290868-3201-158329572211316/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=3cf05d68d95be002f01ec016347c8ba2745fe64a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:52:36 compute-0 sudo[189969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yinlluwjqnfuyfzmolgtdppdcxwdetek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326756.0011656-3235-136719661900625/AnsiballZ_container_config_data.py'
Oct 01 13:52:36 compute-0 sudo[189969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:36 compute-0 python3.9[189971]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct 01 13:52:36 compute-0 sudo[189969]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:37 compute-0 sudo[190121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjeukiflnjhcjshxmzbttvffuiekyqen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326756.8949864-3253-60815402510471/AnsiballZ_container_config_hash.py'
Oct 01 13:52:37 compute-0 sudo[190121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:37 compute-0 python3.9[190123]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 01 13:52:37 compute-0 sudo[190121]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:38 compute-0 sudo[190273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxpnwgqpwbgyjqgkbimrztpojhtintmc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759326757.8926883-3273-46477976433447/AnsiballZ_edpm_container_manage.py'
Oct 01 13:52:38 compute-0 sudo[190273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:38 compute-0 python3[190275]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct 01 13:52:38 compute-0 podman[190313]: 2025-10-01 13:52:38.8419182 +0000 UTC m=+0.075500246 container create 327446013c3d3270de282d60169e48adec912db676e7868a25d3c28441158816 (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, tcib_managed=true, io.buildah.version=1.41.4, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2)
Oct 01 13:52:38 compute-0 podman[190313]: 2025-10-01 13:52:38.799923007 +0000 UTC m=+0.033505113 image pull 656799db0d65542d0e8e413e509a07d242723dfb9640eb11bd2cd711d3cef64f 38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Oct 01 13:52:38 compute-0 python3[190275]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct 01 13:52:39 compute-0 sudo[190273]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:39 compute-0 sudo[190502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymaxjwiqnlikfhwlnklyrhggmtsxckvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326759.2966468-3289-255653324621130/AnsiballZ_stat.py'
Oct 01 13:52:39 compute-0 sudo[190502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:39 compute-0 python3.9[190504]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:52:39 compute-0 sudo[190502]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:40 compute-0 sudo[190656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcvcgopzohkjokmvjqctlcwqjnozytuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326760.4341443-3313-97432117672868/AnsiballZ_container_config_data.py'
Oct 01 13:52:40 compute-0 sudo[190656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:41 compute-0 python3.9[190658]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct 01 13:52:41 compute-0 sudo[190656]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:41 compute-0 sudo[190808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfboiwlexxksfvljplpxbgqmwijnegdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326761.3717954-3331-105224016531586/AnsiballZ_container_config_hash.py'
Oct 01 13:52:41 compute-0 sudo[190808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:41 compute-0 python3.9[190810]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 01 13:52:41 compute-0 sudo[190808]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:42 compute-0 sudo[190960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjmmztjvdomklytyffnkyzgfdggecvpl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759326762.3590565-3351-128764482287608/AnsiballZ_edpm_container_manage.py'
Oct 01 13:52:42 compute-0 sudo[190960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:43 compute-0 python3[190962]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct 01 13:52:43 compute-0 podman[190999]: 2025-10-01 13:52:43.37649977 +0000 UTC m=+0.087941569 container create 7c962c375a8c736fe18d835c9da6241e095f9b62607ec3178c17c8d98a1d1cdf (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, config_id=edpm, managed_by=edpm_ansible, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=nova_compute, org.label-schema.vendor=CentOS)
Oct 01 13:52:43 compute-0 podman[190999]: 2025-10-01 13:52:43.329397884 +0000 UTC m=+0.040839723 image pull 656799db0d65542d0e8e413e509a07d242723dfb9640eb11bd2cd711d3cef64f 38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Oct 01 13:52:43 compute-0 python3[190962]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest kolla_start
Oct 01 13:52:43 compute-0 sudo[190960]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:44 compute-0 sudo[191187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ushtboopyumkijcticcgdmzsjfcgovga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326763.8450282-3367-245391381654059/AnsiballZ_stat.py'
Oct 01 13:52:44 compute-0 sudo[191187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:44 compute-0 python3.9[191189]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:52:44 compute-0 sudo[191187]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:45 compute-0 sudo[191341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voaudwwhyorrwkvtowtyxcqmcmbmvqij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326764.8220325-3385-156984594106850/AnsiballZ_file.py'
Oct 01 13:52:45 compute-0 sudo[191341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:45 compute-0 python3.9[191343]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:52:45 compute-0 sudo[191341]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:46 compute-0 sudo[191492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jedvytrdrcxmwtuxeprtscwuwotbuvnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326765.5889535-3385-67273840917583/AnsiballZ_copy.py'
Oct 01 13:52:46 compute-0 sudo[191492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:46 compute-0 python3.9[191494]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759326765.5889535-3385-67273840917583/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:52:46 compute-0 sudo[191492]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:46 compute-0 sudo[191568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lovijtutrbspgnfshkfkmyrrrsjsvved ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326765.5889535-3385-67273840917583/AnsiballZ_systemd.py'
Oct 01 13:52:46 compute-0 sudo[191568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:47 compute-0 python3.9[191570]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:52:47 compute-0 systemd[1]: Reloading.
Oct 01 13:52:47 compute-0 systemd-sysv-generator[191595]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:52:47 compute-0 systemd-rc-local-generator[191589]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:52:47 compute-0 sudo[191568]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:47 compute-0 sudo[191680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-occyurbdikrgzqscpysbtfhifduyrkrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326765.5889535-3385-67273840917583/AnsiballZ_systemd.py'
Oct 01 13:52:47 compute-0 sudo[191680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:48 compute-0 python3.9[191682]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:52:48 compute-0 systemd[1]: Reloading.
Oct 01 13:52:48 compute-0 systemd-sysv-generator[191714]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:52:48 compute-0 systemd-rc-local-generator[191707]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:52:48 compute-0 systemd[1]: Starting nova_compute container...
Oct 01 13:52:48 compute-0 systemd[1]: Started libcrun container.
Oct 01 13:52:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6037f3854cefbcd640a796b0bcc7bca973a3891b31b650bbbcc01322861589f4/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 01 13:52:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6037f3854cefbcd640a796b0bcc7bca973a3891b31b650bbbcc01322861589f4/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 01 13:52:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6037f3854cefbcd640a796b0bcc7bca973a3891b31b650bbbcc01322861589f4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 01 13:52:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6037f3854cefbcd640a796b0bcc7bca973a3891b31b650bbbcc01322861589f4/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 01 13:52:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6037f3854cefbcd640a796b0bcc7bca973a3891b31b650bbbcc01322861589f4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 01 13:52:48 compute-0 podman[191722]: 2025-10-01 13:52:48.966129286 +0000 UTC m=+0.249896084 container init 7c962c375a8c736fe18d835c9da6241e095f9b62607ec3178c17c8d98a1d1cdf (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible)
Oct 01 13:52:48 compute-0 podman[191722]: 2025-10-01 13:52:48.978646072 +0000 UTC m=+0.262412810 container start 7c962c375a8c736fe18d835c9da6241e095f9b62607ec3178c17c8d98a1d1cdf (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=edpm)
Oct 01 13:52:48 compute-0 nova_compute[191737]: + sudo -E kolla_set_configs
Oct 01 13:52:49 compute-0 podman[191722]: nova_compute
Oct 01 13:52:49 compute-0 systemd[1]: Started nova_compute container.
Oct 01 13:52:49 compute-0 sudo[191680]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Validating config file
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Copying service configuration files
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Deleting /etc/ceph
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Creating directory /etc/ceph
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Setting permission for /etc/ceph
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Writing out command to execute
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 01 13:52:49 compute-0 nova_compute[191737]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 01 13:52:49 compute-0 nova_compute[191737]: ++ cat /run_command
Oct 01 13:52:49 compute-0 nova_compute[191737]: + CMD=nova-compute
Oct 01 13:52:49 compute-0 nova_compute[191737]: + ARGS=
Oct 01 13:52:49 compute-0 nova_compute[191737]: + sudo kolla_copy_cacerts
Oct 01 13:52:49 compute-0 nova_compute[191737]: + [[ ! -n '' ]]
Oct 01 13:52:49 compute-0 nova_compute[191737]: + . kolla_extend_start
Oct 01 13:52:49 compute-0 nova_compute[191737]: + echo 'Running command: '\''nova-compute'\'''
Oct 01 13:52:49 compute-0 nova_compute[191737]: Running command: 'nova-compute'
Oct 01 13:52:49 compute-0 nova_compute[191737]: + umask 0022
Oct 01 13:52:49 compute-0 nova_compute[191737]: + exec nova-compute
Oct 01 13:52:50 compute-0 python3.9[191898]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:52:50 compute-0 podman[192022]: 2025-10-01 13:52:50.966520786 +0000 UTC m=+0.077140679 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 01 13:52:51 compute-0 podman[192023]: 2025-10-01 13:52:51.003097418 +0000 UTC m=+0.114495391 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 01 13:52:51 compute-0 python3.9[192075]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:52:51 compute-0 nova_compute[191737]: 2025-10-01 13:52:51.232 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 01 13:52:51 compute-0 nova_compute[191737]: 2025-10-01 13:52:51.232 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 01 13:52:51 compute-0 nova_compute[191737]: 2025-10-01 13:52:51.232 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 01 13:52:51 compute-0 nova_compute[191737]: 2025-10-01 13:52:51.233 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 01 13:52:51 compute-0 nova_compute[191737]: 2025-10-01 13:52:51.415 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 13:52:51 compute-0 nova_compute[191737]: 2025-10-01 13:52:51.437 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 13:52:51 compute-0 nova_compute[191737]: 2025-10-01 13:52:51.472 2 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Oct 01 13:52:51 compute-0 nova_compute[191737]: 2025-10-01 13:52:51.474 2 WARNING oslo_config.cfg [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Oct 01 13:52:52 compute-0 python3.9[192245]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:52:52 compute-0 sudo[192395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoeizybewsxemnjtjnvkpvfzbglwgfti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326772.350444-3505-115892718701143/AnsiballZ_podman_container.py'
Oct 01 13:52:52 compute-0 sudo[192395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.060 2 INFO nova.virt.driver [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 01 13:52:53 compute-0 python3.9[192397]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.182 2 INFO nova.compute.provider_config [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 01 13:52:53 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 01 13:52:53 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 01 13:52:53 compute-0 sudo[192395]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.689 2 DEBUG oslo_concurrency.lockutils [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.690 2 DEBUG oslo_concurrency.lockutils [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.690 2 DEBUG oslo_concurrency.lockutils [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.691 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.691 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.692 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.692 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.693 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.693 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.693 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.694 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.694 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.694 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.695 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.695 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.695 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.696 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.696 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.696 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.697 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.697 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.697 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.697 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.698 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.698 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.698 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.699 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.699 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.699 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.700 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.700 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.700 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.701 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.701 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.701 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.702 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.702 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.702 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.702 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.703 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.703 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.703 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.703 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.704 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.704 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.704 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.705 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.705 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.705 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.706 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.706 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.706 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.707 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.707 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.707 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.707 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.708 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.708 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.708 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.709 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.709 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.709 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.709 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.710 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.710 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.710 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.711 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.711 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.711 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.711 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.712 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.712 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.712 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.713 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.713 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.713 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.714 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.714 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.714 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.714 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.715 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.715 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.715 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.716 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.716 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.716 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.717 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.717 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.717 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.717 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.718 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.718 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.718 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.719 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.719 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.719 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.719 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.720 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.720 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.720 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.721 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.721 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.721 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.722 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.722 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.722 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.723 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.723 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.723 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.723 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.724 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.724 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.724 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.725 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.725 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.725 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.725 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.726 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.726 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.726 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.727 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.727 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.727 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.728 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.728 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.728 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.729 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.729 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.729 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.730 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.730 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.730 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.731 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.731 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.731 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.732 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.732 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.733 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.733 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.733 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.733 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.733 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.734 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.734 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.734 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.734 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.734 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.734 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.735 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.735 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.735 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.735 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.735 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.736 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.736 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.736 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.737 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.737 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.737 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.737 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.737 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.738 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.738 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.738 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.738 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.738 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.738 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.739 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.739 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.739 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.739 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.739 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.740 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.740 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.740 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.740 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.740 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.740 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.741 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.741 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.741 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.741 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.741 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.742 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.742 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.742 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.742 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.742 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.742 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.743 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.743 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.743 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.743 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.743 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.743 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.744 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.744 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.744 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.744 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.744 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.745 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.745 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.745 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.745 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.745 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.745 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.746 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.746 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.746 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.746 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.746 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.747 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.747 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.747 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.747 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.747 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.747 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.747 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.748 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.748 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.748 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.748 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.748 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.749 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.749 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.749 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.749 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.749 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.749 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.750 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.750 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.750 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.750 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.750 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.750 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.751 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.751 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.751 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.751 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.751 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.751 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.752 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.752 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.752 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.752 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.752 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.752 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.752 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.753 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.753 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.753 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.753 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.753 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.754 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.754 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.754 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.754 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.754 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.754 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.755 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.755 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.755 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.755 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.755 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.756 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.756 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.756 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.756 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.756 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.756 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.757 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.757 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.757 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.757 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.758 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.758 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.758 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.758 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.758 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.758 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.759 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.759 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.759 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.759 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.759 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.760 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.760 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.760 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.760 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.760 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.760 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.761 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.761 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.761 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.761 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.761 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.761 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.762 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.762 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.762 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.762 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.762 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.762 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.763 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.763 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.763 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.763 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.763 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.763 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.764 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.764 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.764 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.764 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.764 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.765 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.765 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.765 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.765 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.765 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.765 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.766 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.766 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.766 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.766 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.766 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.766 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.767 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.767 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.767 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.767 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.767 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.767 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.768 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.768 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.768 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.768 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.770 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.770 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.770 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.770 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.770 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.771 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.771 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.771 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.771 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.771 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.771 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.771 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.771 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.771 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.772 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.772 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.772 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.772 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.772 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.772 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.772 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.773 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.773 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.773 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.773 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.773 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.773 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.773 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.773 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.774 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.774 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.774 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.774 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.774 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.774 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.774 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.774 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.775 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.775 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.775 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.775 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.775 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.775 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.775 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.776 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.776 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.776 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.776 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.776 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.776 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.776 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.776 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.777 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.777 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.777 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.777 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.777 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.777 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.777 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.777 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.778 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.778 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.778 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.778 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.778 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.778 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.778 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.778 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.779 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.779 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.779 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.779 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.779 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.779 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.779 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.779 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.779 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.780 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.780 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.780 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.780 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.780 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.780 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.780 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.781 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.781 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.781 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.781 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.781 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.781 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.781 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.781 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.782 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.782 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.782 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.782 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.782 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.782 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.782 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.782 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.782 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.783 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.783 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.783 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.783 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.783 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.783 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.783 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.783 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.784 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.784 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.784 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.784 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.784 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.784 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.784 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.784 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.785 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.785 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.785 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.785 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.785 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.785 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.785 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.785 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.786 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.786 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.786 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.786 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.786 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.786 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.786 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.786 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.786 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.787 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.787 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.787 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.787 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.787 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.787 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.787 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.788 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.788 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.788 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.788 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.788 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.788 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.788 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.789 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.789 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.789 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.789 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.789 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.789 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.790 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.790 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.790 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.790 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.790 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.790 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.790 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.791 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.791 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.791 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.791 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.791 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.791 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.791 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.791 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.792 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.792 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.792 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.792 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.792 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.792 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.792 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.792 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.793 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.793 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.793 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.793 2 WARNING oslo_config.cfg [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 01 13:52:53 compute-0 nova_compute[191737]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 01 13:52:53 compute-0 nova_compute[191737]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 01 13:52:53 compute-0 nova_compute[191737]: and ``live_migration_inbound_addr`` respectively.
Oct 01 13:52:53 compute-0 nova_compute[191737]: ).  Its value may be silently ignored in the future.
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.793 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.793 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.793 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.794 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.794 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.794 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.794 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.794 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.794 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.794 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.795 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.795 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.795 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.795 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.795 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.795 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.795 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.795 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.796 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.796 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.796 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.796 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.796 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.796 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.796 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.796 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.797 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.797 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.797 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.797 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.797 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.797 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.797 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.798 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.798 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.798 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.798 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.798 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.798 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.798 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.798 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.799 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.799 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.799 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.799 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.799 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.799 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.799 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.799 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.800 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.800 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.800 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.800 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.800 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.800 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.800 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.801 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.801 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.801 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.801 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.801 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.801 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.801 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.801 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.801 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.802 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.802 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.802 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.802 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.802 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.802 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.802 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.802 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.803 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.803 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.803 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.803 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.803 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.803 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.803 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.803 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.804 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.804 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.804 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.804 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.804 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.804 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.804 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.805 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.805 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.805 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.805 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.805 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.805 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.805 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.806 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.806 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.806 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.806 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.806 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.806 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.807 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.807 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.807 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.807 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.807 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.807 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.807 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.808 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.808 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.808 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.808 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.808 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.808 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.809 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.809 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.809 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.809 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.809 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.809 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.809 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.809 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.810 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.810 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.810 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.810 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.810 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.810 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.810 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.811 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.811 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.811 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.811 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.811 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.811 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.811 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.811 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.812 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.812 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.812 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.812 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.812 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.812 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.812 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.812 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.813 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.813 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.813 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.813 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.813 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.813 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.813 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.813 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.814 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.814 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.814 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.814 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.814 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.814 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.814 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.814 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.815 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.815 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.815 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.815 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.815 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.815 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.815 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.815 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.816 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.816 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.816 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.816 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.816 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.816 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.816 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.816 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.817 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.817 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.817 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.817 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.817 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.817 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.817 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.817 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.818 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.818 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.818 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.818 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.818 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.818 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.818 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.819 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.819 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.819 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.819 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.819 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.819 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.819 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.819 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.819 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.820 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.820 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.820 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.820 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.820 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.820 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.820 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.821 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.821 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.821 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.821 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.821 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.821 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.821 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.821 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.822 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.822 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.822 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.822 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.822 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.822 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.822 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.822 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.823 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.823 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.823 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.823 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.823 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.823 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.823 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.823 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.824 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.824 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.824 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.824 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.824 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.824 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.824 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.825 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.825 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.825 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.825 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.825 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.825 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.825 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.825 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.825 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.826 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.826 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.826 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.826 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.826 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.826 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.826 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.827 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.827 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.827 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.827 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.827 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.827 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.828 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.828 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.828 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.828 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.828 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.828 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.828 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.829 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.829 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.829 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.829 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.829 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.829 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.830 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.830 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.830 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.830 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.830 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.830 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.830 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.831 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.831 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.831 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.831 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.831 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.831 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.831 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.831 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.832 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.832 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.832 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.832 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.832 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.832 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.833 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.833 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.833 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.833 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.833 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.833 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.833 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.834 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.834 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.834 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.834 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.834 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.834 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.834 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.834 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.835 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.835 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.835 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.835 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.835 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.835 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.835 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.835 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.836 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.836 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.836 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.836 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.836 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.836 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.836 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.836 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.836 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.837 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.837 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.837 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.837 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.837 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.837 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.837 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.837 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.838 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.838 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.838 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.838 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.838 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.838 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.838 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.838 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.839 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.839 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.839 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.839 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.839 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.839 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.839 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.839 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.840 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.840 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.840 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.840 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.840 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.840 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.840 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.840 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.841 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.841 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.841 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.841 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.841 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.841 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.841 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.841 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.841 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.842 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.842 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.842 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.842 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.842 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.842 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.842 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.842 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.843 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.843 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.843 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.843 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.843 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.843 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.843 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.843 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.844 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.844 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.844 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.844 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.844 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.844 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.844 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.844 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.845 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.845 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.845 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.845 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.845 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.845 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.845 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.845 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.846 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.846 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.846 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.846 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.846 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.846 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.846 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.846 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.847 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.847 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.847 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.847 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.847 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.847 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.847 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.847 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.848 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.848 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.848 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.848 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.848 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.848 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.848 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.848 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.849 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.849 2 DEBUG oslo_service.backend._eventlet.service [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Oct 01 13:52:53 compute-0 nova_compute[191737]: 2025-10-01 13:52:53.850 2 INFO nova.service [-] Starting compute node (version 32.1.0-0.20250919142712.b99a882.el10)
Oct 01 13:52:53 compute-0 sudo[192573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imdfbsxqoktnokwcyiekgepumxlptdbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326773.5121527-3521-214018821241069/AnsiballZ_systemd.py'
Oct 01 13:52:53 compute-0 sudo[192573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:54 compute-0 python3.9[192575]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:52:54 compute-0 nova_compute[191737]: 2025-10-01 13:52:54.359 2 DEBUG nova.virt.libvirt.host [None req-0f66a62a-a2cc-4e3c-b980-5f0ce2f4cb8b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Oct 01 13:52:54 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Oct 01 13:52:54 compute-0 systemd[1]: Started libvirt QEMU daemon.
Oct 01 13:52:54 compute-0 systemd[1]: Stopping nova_compute container...
Oct 01 13:52:54 compute-0 nova_compute[191737]: 2025-10-01 13:52:54.462 2 DEBUG nova.virt.libvirt.host [None req-0f66a62a-a2cc-4e3c-b980-5f0ce2f4cb8b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f60c1f2f710> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Oct 01 13:52:54 compute-0 nova_compute[191737]: libvirt:  error : internal error: could not initialize domain event timer
Oct 01 13:52:54 compute-0 nova_compute[191737]: 2025-10-01 13:52:54.464 2 WARNING nova.virt.libvirt.host [None req-0f66a62a-a2cc-4e3c-b980-5f0ce2f4cb8b - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Oct 01 13:52:54 compute-0 nova_compute[191737]: 2025-10-01 13:52:54.465 2 DEBUG nova.virt.libvirt.host [None req-0f66a62a-a2cc-4e3c-b980-5f0ce2f4cb8b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f60c1f2f710> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Oct 01 13:52:54 compute-0 nova_compute[191737]: 2025-10-01 13:52:54.468 2 DEBUG nova.virt.libvirt.host [None req-0f66a62a-a2cc-4e3c-b980-5f0ce2f4cb8b - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Oct 01 13:52:54 compute-0 nova_compute[191737]: 2025-10-01 13:52:54.469 2 DEBUG nova.virt.libvirt.host [None req-0f66a62a-a2cc-4e3c-b980-5f0ce2f4cb8b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Oct 01 13:52:54 compute-0 nova_compute[191737]: 2025-10-01 13:52:54.470 2 INFO nova.utils [None req-0f66a62a-a2cc-4e3c-b980-5f0ce2f4cb8b - - - - - -] The default thread pool MainProcess.default is initialized
Oct 01 13:52:54 compute-0 nova_compute[191737]: 2025-10-01 13:52:54.470 2 DEBUG nova.virt.libvirt.host [None req-0f66a62a-a2cc-4e3c-b980-5f0ce2f4cb8b - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Oct 01 13:52:54 compute-0 nova_compute[191737]: 2025-10-01 13:52:54.472 2 INFO nova.virt.libvirt.driver [None req-0f66a62a-a2cc-4e3c-b980-5f0ce2f4cb8b - - - - - -] Connection event '1' reason 'None'
Oct 01 13:52:54 compute-0 nova_compute[191737]: 2025-10-01 13:52:54.510 2 DEBUG oslo_concurrency.lockutils [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 13:52:54 compute-0 nova_compute[191737]: 2025-10-01 13:52:54.511 2 DEBUG oslo_concurrency.lockutils [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 13:52:54 compute-0 nova_compute[191737]: 2025-10-01 13:52:54.511 2 DEBUG oslo_concurrency.lockutils [None req-a9b31794-191d-4918-9e76-9c0588c56659 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 13:52:55 compute-0 nova_compute[191737]: 2025-10-01 13:52:55.499 2 WARNING nova.virt.libvirt.driver [None req-0f66a62a-a2cc-4e3c-b980-5f0ce2f4cb8b - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 01 13:52:55 compute-0 nova_compute[191737]: 2025-10-01 13:52:55.501 2 DEBUG nova.virt.libvirt.volume.mount [None req-0f66a62a-a2cc-4e3c-b980-5f0ce2f4cb8b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 01 13:52:56 compute-0 virtqemud[192597]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct 01 13:52:56 compute-0 virtqemud[192597]: hostname: compute-0
Oct 01 13:52:56 compute-0 virtqemud[192597]: End of file while reading data: Input/output error
Oct 01 13:52:56 compute-0 systemd[1]: libpod-7c962c375a8c736fe18d835c9da6241e095f9b62607ec3178c17c8d98a1d1cdf.scope: Deactivated successfully.
Oct 01 13:52:56 compute-0 systemd[1]: libpod-7c962c375a8c736fe18d835c9da6241e095f9b62607ec3178c17c8d98a1d1cdf.scope: Consumed 3.738s CPU time.
Oct 01 13:52:56 compute-0 podman[192617]: 2025-10-01 13:52:56.305677413 +0000 UTC m=+1.840469757 container died 7c962c375a8c736fe18d835c9da6241e095f9b62607ec3178c17c8d98a1d1cdf (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 01 13:52:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c962c375a8c736fe18d835c9da6241e095f9b62607ec3178c17c8d98a1d1cdf-userdata-shm.mount: Deactivated successfully.
Oct 01 13:52:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-6037f3854cefbcd640a796b0bcc7bca973a3891b31b650bbbcc01322861589f4-merged.mount: Deactivated successfully.
Oct 01 13:52:56 compute-0 podman[192617]: 2025-10-01 13:52:56.373281992 +0000 UTC m=+1.908074336 container cleanup 7c962c375a8c736fe18d835c9da6241e095f9b62607ec3178c17c8d98a1d1cdf (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 01 13:52:56 compute-0 podman[192617]: nova_compute
Oct 01 13:52:56 compute-0 podman[192670]: nova_compute
Oct 01 13:52:56 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct 01 13:52:56 compute-0 systemd[1]: Stopped nova_compute container.
Oct 01 13:52:56 compute-0 systemd[1]: Starting nova_compute container...
Oct 01 13:52:56 compute-0 systemd[1]: Started libcrun container.
Oct 01 13:52:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6037f3854cefbcd640a796b0bcc7bca973a3891b31b650bbbcc01322861589f4/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct 01 13:52:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6037f3854cefbcd640a796b0bcc7bca973a3891b31b650bbbcc01322861589f4/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct 01 13:52:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6037f3854cefbcd640a796b0bcc7bca973a3891b31b650bbbcc01322861589f4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct 01 13:52:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6037f3854cefbcd640a796b0bcc7bca973a3891b31b650bbbcc01322861589f4/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct 01 13:52:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6037f3854cefbcd640a796b0bcc7bca973a3891b31b650bbbcc01322861589f4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 01 13:52:56 compute-0 podman[192683]: 2025-10-01 13:52:56.665727413 +0000 UTC m=+0.143181117 container init 7c962c375a8c736fe18d835c9da6241e095f9b62607ec3178c17c8d98a1d1cdf (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 01 13:52:56 compute-0 podman[192683]: 2025-10-01 13:52:56.677948051 +0000 UTC m=+0.155401715 container start 7c962c375a8c736fe18d835c9da6241e095f9b62607ec3178c17c8d98a1d1cdf (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 01 13:52:56 compute-0 podman[192683]: nova_compute
Oct 01 13:52:56 compute-0 nova_compute[192698]: + sudo -E kolla_set_configs
Oct 01 13:52:56 compute-0 systemd[1]: Started nova_compute container.
Oct 01 13:52:56 compute-0 sudo[192573]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Validating config file
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Copying service configuration files
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Deleting /etc/ceph
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Creating directory /etc/ceph
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Setting permission for /etc/ceph
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Writing out command to execute
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct 01 13:52:56 compute-0 nova_compute[192698]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct 01 13:52:56 compute-0 nova_compute[192698]: ++ cat /run_command
Oct 01 13:52:56 compute-0 nova_compute[192698]: + CMD=nova-compute
Oct 01 13:52:56 compute-0 nova_compute[192698]: + ARGS=
Oct 01 13:52:56 compute-0 nova_compute[192698]: + sudo kolla_copy_cacerts
Oct 01 13:52:56 compute-0 nova_compute[192698]: + [[ ! -n '' ]]
Oct 01 13:52:56 compute-0 nova_compute[192698]: + . kolla_extend_start
Oct 01 13:52:56 compute-0 nova_compute[192698]: Running command: 'nova-compute'
Oct 01 13:52:56 compute-0 nova_compute[192698]: + echo 'Running command: '\''nova-compute'\'''
Oct 01 13:52:56 compute-0 nova_compute[192698]: + umask 0022
Oct 01 13:52:56 compute-0 nova_compute[192698]: + exec nova-compute
Oct 01 13:52:57 compute-0 sudo[192859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkzdveziddwzmlzissjlflutfykgnpof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326777.0001016-3539-139676031011757/AnsiballZ_podman_container.py'
Oct 01 13:52:57 compute-0 sudo[192859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:52:57 compute-0 python3.9[192861]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct 01 13:52:57 compute-0 systemd[1]: Started libpod-conmon-327446013c3d3270de282d60169e48adec912db676e7868a25d3c28441158816.scope.
Oct 01 13:52:57 compute-0 systemd[1]: Started libcrun container.
Oct 01 13:52:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7214bf69bf24e3bcce4927ffcd17dd105634a77f04e0600f0dea5e76a80eb1de/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct 01 13:52:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7214bf69bf24e3bcce4927ffcd17dd105634a77f04e0600f0dea5e76a80eb1de/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct 01 13:52:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7214bf69bf24e3bcce4927ffcd17dd105634a77f04e0600f0dea5e76a80eb1de/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct 01 13:52:58 compute-0 podman[192886]: 2025-10-01 13:52:58.030198193 +0000 UTC m=+0.188670091 container init 327446013c3d3270de282d60169e48adec912db676e7868a25d3c28441158816 (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=nova_compute_init, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 01 13:52:58 compute-0 podman[192886]: 2025-10-01 13:52:58.044926006 +0000 UTC m=+0.203397834 container start 327446013c3d3270de282d60169e48adec912db676e7868a25d3c28441158816 (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 01 13:52:58 compute-0 python3.9[192861]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Applying nova statedir ownership
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct 01 13:52:58 compute-0 nova_compute_init[192907]: INFO:nova_statedir:Nova statedir ownership complete
Oct 01 13:52:58 compute-0 systemd[1]: libpod-327446013c3d3270de282d60169e48adec912db676e7868a25d3c28441158816.scope: Deactivated successfully.
Oct 01 13:52:58 compute-0 podman[192908]: 2025-10-01 13:52:58.14268788 +0000 UTC m=+0.052062506 container died 327446013c3d3270de282d60169e48adec912db676e7868a25d3c28441158816 (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 01 13:52:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-327446013c3d3270de282d60169e48adec912db676e7868a25d3c28441158816-userdata-shm.mount: Deactivated successfully.
Oct 01 13:52:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-7214bf69bf24e3bcce4927ffcd17dd105634a77f04e0600f0dea5e76a80eb1de-merged.mount: Deactivated successfully.
Oct 01 13:52:58 compute-0 podman[192921]: 2025-10-01 13:52:58.243528235 +0000 UTC m=+0.086507133 container cleanup 327446013c3d3270de282d60169e48adec912db676e7868a25d3c28441158816 (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 01 13:52:58 compute-0 systemd[1]: libpod-conmon-327446013c3d3270de282d60169e48adec912db676e7868a25d3c28441158816.scope: Deactivated successfully.
Oct 01 13:52:58 compute-0 sudo[192859]: pam_unix(sudo:session): session closed for user root
Oct 01 13:52:58 compute-0 nova_compute[192698]: 2025-10-01 13:52:58.704 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 01 13:52:58 compute-0 nova_compute[192698]: 2025-10-01 13:52:58.704 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 01 13:52:58 compute-0 nova_compute[192698]: 2025-10-01 13:52:58.704 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Oct 01 13:52:58 compute-0 nova_compute[192698]: 2025-10-01 13:52:58.704 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Oct 01 13:52:58 compute-0 nova_compute[192698]: 2025-10-01 13:52:58.831 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 13:52:58 compute-0 nova_compute[192698]: 2025-10-01 13:52:58.864 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 13:52:58 compute-0 sshd-session[158246]: Connection closed by 192.168.122.30 port 48850
Oct 01 13:52:58 compute-0 sshd-session[158243]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:52:58 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Oct 01 13:52:58 compute-0 systemd[1]: session-25.scope: Consumed 2min 55.666s CPU time.
Oct 01 13:52:58 compute-0 systemd-logind[791]: Session 25 logged out. Waiting for processes to exit.
Oct 01 13:52:58 compute-0 systemd-logind[791]: Removed session 25.
Oct 01 13:52:58 compute-0 nova_compute[192698]: 2025-10-01 13:52:58.925 2 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Oct 01 13:52:58 compute-0 nova_compute[192698]: 2025-10-01 13:52:58.928 2 WARNING oslo_config.cfg [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Oct 01 13:52:59 compute-0 nova_compute[192698]: 2025-10-01 13:52:59.920 2 INFO nova.virt.driver [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.040 2 INFO nova.compute.provider_config [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.548 2 DEBUG oslo_concurrency.lockutils [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.549 2 DEBUG oslo_concurrency.lockutils [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.549 2 DEBUG oslo_concurrency.lockutils [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.550 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.550 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.550 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.551 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.551 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.552 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.552 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.553 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.553 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.553 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.554 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.554 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.554 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.555 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.555 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.556 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.556 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.557 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.557 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.558 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.558 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.558 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.559 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.559 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.560 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.560 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.560 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.561 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.561 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.561 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.562 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.562 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.563 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.563 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.563 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.564 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.564 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.564 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.565 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.565 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.565 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.566 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.566 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.567 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.567 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.568 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.568 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.568 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.569 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.569 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.569 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.570 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.570 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.571 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.571 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.571 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.572 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.572 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.572 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.573 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.573 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.573 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.574 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.574 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.575 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.575 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.575 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.576 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.576 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.576 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.577 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.577 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.577 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.578 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.578 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.579 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.579 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.579 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.580 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.580 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.581 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.581 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.582 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.582 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.582 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.583 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.583 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.583 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.584 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.584 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.584 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.584 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.585 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.585 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.585 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.585 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.586 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.586 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.586 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.587 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.587 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.587 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.587 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.588 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.588 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.588 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.589 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.589 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.589 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.590 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.590 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.590 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.591 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.591 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.591 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.592 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.592 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.593 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.593 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.593 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.593 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.594 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.594 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.594 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.595 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.595 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.595 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.595 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.596 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.596 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.596 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.596 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.597 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.597 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.597 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.597 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.598 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.598 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.598 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.599 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.599 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.599 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.599 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.600 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.600 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.600 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.600 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.601 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.601 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.601 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.602 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.602 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.602 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.602 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.603 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.603 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.603 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.604 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.604 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.604 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.605 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.605 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.605 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.605 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.606 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.606 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.606 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.606 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.607 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.607 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.607 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.608 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.608 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.608 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.608 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.609 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.609 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.609 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.610 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.610 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.610 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.610 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.611 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.611 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.611 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.612 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.612 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.612 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.612 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.613 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.613 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.613 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.614 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.614 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.614 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.614 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.615 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.615 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.615 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.615 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.616 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.616 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.616 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.616 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.616 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.616 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.617 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.617 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.617 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.617 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.617 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.617 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.618 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.618 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.618 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.618 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.618 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.618 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.619 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.619 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.619 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.619 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.619 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.620 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.620 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.620 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.620 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.620 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.620 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.621 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.621 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.621 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.621 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.621 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.621 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.622 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.622 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.622 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.622 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.622 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.622 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.623 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.623 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.623 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.623 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.623 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.624 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.624 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.624 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.624 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.624 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.624 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.625 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.625 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.625 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.625 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.625 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.625 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.626 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.626 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.626 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.626 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.626 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.627 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.627 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.627 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.627 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.627 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.627 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.628 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.628 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.628 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.628 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.628 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.628 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.629 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.629 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.629 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.629 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.629 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.629 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.630 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.630 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.630 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.630 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.630 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.630 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.630 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.631 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.631 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.631 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.631 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.631 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.631 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.632 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.632 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.632 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.632 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.632 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.633 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.633 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.633 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.633 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.633 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.633 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.634 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.634 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.634 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.634 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.634 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.634 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.635 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.635 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.635 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.635 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.635 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.635 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.636 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.636 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.636 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.636 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.636 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.636 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.637 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.637 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.637 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.637 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.637 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.637 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.638 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.638 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.639 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.639 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.640 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.640 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.640 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.640 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.640 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.641 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.641 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.641 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.641 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.641 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.641 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.642 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.642 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.642 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.642 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.642 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.642 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.643 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.643 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.643 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.643 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.643 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.644 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.644 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.644 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.644 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.644 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.644 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.645 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.645 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.645 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.645 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.645 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.645 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.646 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.646 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.646 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.646 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.646 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.646 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.647 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.647 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.647 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.647 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.647 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.648 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.648 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.648 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.648 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.648 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.649 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.649 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.649 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.649 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.649 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.649 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.650 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.650 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.650 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.650 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.650 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.650 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.650 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.651 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.651 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.651 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.651 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.651 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.652 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.652 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.652 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.652 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.652 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.652 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.653 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.653 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.653 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.653 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.653 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.653 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.654 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.654 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.654 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.654 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.654 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.655 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.655 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.655 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.655 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.655 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.655 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.656 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.656 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.656 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.656 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.656 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.657 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.657 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.657 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.657 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.657 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.657 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.658 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.658 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.658 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.658 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.658 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.659 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.659 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.659 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.659 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.659 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.660 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.660 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.660 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.660 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.660 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.661 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.661 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.661 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.661 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.661 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.661 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.662 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.662 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.662 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.662 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.662 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.662 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.663 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.663 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.663 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.663 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.663 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.663 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.663 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.664 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.664 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.664 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.664 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.664 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.664 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.664 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.665 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.665 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.665 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.665 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.665 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.665 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.665 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.665 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.665 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.666 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.666 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.666 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.666 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.666 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.666 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.666 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.666 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.667 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.667 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.667 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.667 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.667 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.667 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.667 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.667 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.667 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.668 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.668 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.668 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.668 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.668 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.668 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.668 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.668 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.669 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.669 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.669 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.669 2 WARNING oslo_config.cfg [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct 01 13:53:00 compute-0 nova_compute[192698]: live_migration_uri is deprecated for removal in favor of two other options that
Oct 01 13:53:00 compute-0 nova_compute[192698]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct 01 13:53:00 compute-0 nova_compute[192698]: and ``live_migration_inbound_addr`` respectively.
Oct 01 13:53:00 compute-0 nova_compute[192698]: ).  Its value may be silently ignored in the future.
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.669 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.669 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.669 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.669 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.670 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.670 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.670 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.670 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.670 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.670 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.670 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.671 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.671 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.671 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.671 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.671 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.671 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.671 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.671 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.672 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.672 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.672 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.672 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.672 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.672 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.672 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.673 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.673 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.673 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.673 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.673 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.673 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.673 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.673 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.674 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.674 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.674 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.674 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.674 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.674 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.674 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.674 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.675 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.675 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.675 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.675 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.675 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.675 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.675 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.675 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.676 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.676 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.676 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.676 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.676 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.676 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.676 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.677 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.677 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.677 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.677 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.677 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.677 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.677 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.678 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.678 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.678 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.678 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.678 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.678 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.678 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.678 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.678 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.679 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.679 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.679 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.679 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.679 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.679 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.679 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.680 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.680 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.680 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.680 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.680 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.680 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.680 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.680 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.680 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.681 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.681 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.681 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.681 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.681 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.681 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.682 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.682 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.682 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.682 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.682 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.682 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.682 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.683 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.683 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.683 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.683 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.683 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.683 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.683 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.684 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.684 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.684 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.684 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.684 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.684 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.684 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.684 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.684 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.685 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.685 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.685 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.685 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.685 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.685 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.685 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.685 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.686 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.686 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.686 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.686 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.686 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.686 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.686 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.686 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.687 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.687 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.687 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.687 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.687 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.687 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.687 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.687 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.688 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.688 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.688 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.688 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.688 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.688 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.688 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.689 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.689 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.689 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.689 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.689 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.689 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.689 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.689 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.690 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.690 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.690 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.690 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.690 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.690 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.690 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.691 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.691 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.691 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.691 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.691 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.691 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.691 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.691 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.692 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.692 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.692 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.692 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.692 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.692 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.692 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.692 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.693 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.693 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.693 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.693 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.693 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.693 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.693 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.693 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.694 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.694 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.694 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.694 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.694 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.694 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.694 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.695 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.695 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.695 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.695 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.695 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.695 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.695 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.695 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.695 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.696 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.696 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.696 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.696 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.696 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.696 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.696 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.697 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.697 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.697 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.697 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.697 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.697 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.697 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.697 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.698 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.698 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.698 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.698 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.698 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.698 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.698 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.698 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.698 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.699 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.699 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.699 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.699 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.699 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.699 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.699 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.699 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.700 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.700 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.700 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.700 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.700 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.700 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.700 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.700 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.700 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.701 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.701 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.701 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.701 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.701 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.701 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.701 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.701 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.701 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.702 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.702 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.702 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.702 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.702 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.702 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.702 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.703 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.703 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.703 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.703 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.703 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.703 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.703 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.703 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.704 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.704 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.704 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.704 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.704 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.704 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.704 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.704 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.705 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.705 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.705 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.705 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.705 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.705 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.705 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.706 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.706 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.706 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.706 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.706 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.706 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.706 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.706 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.707 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.707 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.707 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.707 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.707 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.707 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.707 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.707 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.708 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.708 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.708 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.708 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.708 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.708 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.708 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.708 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.708 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.709 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.709 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.709 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.709 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.709 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.709 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.709 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.709 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.709 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.710 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.710 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.710 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.710 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.710 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.710 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.710 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.710 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.711 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.711 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.711 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.711 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.711 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.711 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.711 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.711 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.712 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.712 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.712 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.712 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.712 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.712 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.712 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.712 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.712 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.713 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.713 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.713 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.713 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.713 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.713 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.713 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.713 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.713 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.714 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.714 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.714 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.714 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.714 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.714 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.714 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.714 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.715 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.715 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.715 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.715 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.715 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.715 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.715 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.715 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.715 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.716 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.716 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.716 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.716 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.716 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.716 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.716 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.716 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.717 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.717 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.717 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.717 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.717 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.717 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.717 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.717 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.717 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.718 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.718 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.718 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.718 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.718 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.718 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.718 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.718 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.718 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.719 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.719 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.719 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.719 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.719 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.719 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.719 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.719 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.720 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.720 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.720 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.720 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.720 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.720 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.720 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.720 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.721 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.721 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.721 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.721 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.721 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.721 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.721 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.721 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.721 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.722 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.722 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.722 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.722 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.722 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.722 2 DEBUG oslo_service.backend._eventlet.service [None req-01cd1d44-3f0e-406b-8946-bcc4f54021d6 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Oct 01 13:53:00 compute-0 nova_compute[192698]: 2025-10-01 13:53:00.723 2 INFO nova.service [-] Starting compute node (version 32.1.0-0.20250919142712.b99a882.el10)
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.231 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.253 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fc797e6b5c0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Oct 01 13:53:01 compute-0 nova_compute[192698]: libvirt:  error : internal error: could not initialize domain event timer
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.254 2 WARNING nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.255 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fc797e6b5c0> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.259 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.259 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.260 2 INFO nova.utils [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] The default thread pool MainProcess.default is initialized
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.260 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.261 2 INFO nova.virt.libvirt.driver [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Connection event '1' reason 'None'
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.275 2 INFO nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Libvirt host capabilities <capabilities>
Oct 01 13:53:01 compute-0 nova_compute[192698]: 
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <host>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <uuid>3609044e-8b89-4b90-b4b0-db5bb7be664b</uuid>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <cpu>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <arch>x86_64</arch>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model>EPYC-Rome-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <vendor>AMD</vendor>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <microcode version='16777317'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <signature family='23' model='49' stepping='0'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <maxphysaddr mode='emulate' bits='40'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='x2apic'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='tsc-deadline'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='osxsave'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='hypervisor'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='tsc_adjust'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='spec-ctrl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='stibp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='arch-capabilities'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='cmp_legacy'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='topoext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='virt-ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='lbrv'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='tsc-scale'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='vmcb-clean'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='pause-filter'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='pfthreshold'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='svme-addr-chk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='rdctl-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='skip-l1dfl-vmentry'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='mds-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature name='pschange-mc-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <pages unit='KiB' size='4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <pages unit='KiB' size='2048'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <pages unit='KiB' size='1048576'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </cpu>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <power_management>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <suspend_mem/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <suspend_disk/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <suspend_hybrid/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </power_management>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <iommu support='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <migration_features>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <live/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <uri_transports>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <uri_transport>tcp</uri_transport>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <uri_transport>rdma</uri_transport>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </uri_transports>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </migration_features>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <topology>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <cells num='1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <cell id='0'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:           <memory unit='KiB'>7864120</memory>
Oct 01 13:53:01 compute-0 nova_compute[192698]:           <pages unit='KiB' size='4'>1966030</pages>
Oct 01 13:53:01 compute-0 nova_compute[192698]:           <pages unit='KiB' size='2048'>0</pages>
Oct 01 13:53:01 compute-0 nova_compute[192698]:           <pages unit='KiB' size='1048576'>0</pages>
Oct 01 13:53:01 compute-0 nova_compute[192698]:           <distances>
Oct 01 13:53:01 compute-0 nova_compute[192698]:             <sibling id='0' value='10'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:           </distances>
Oct 01 13:53:01 compute-0 nova_compute[192698]:           <cpus num='8'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:           </cpus>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         </cell>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </cells>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </topology>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <cache>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </cache>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <secmodel>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model>selinux</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <doi>0</doi>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </secmodel>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <secmodel>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model>dac</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <doi>0</doi>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <baselabel type='kvm'>+107:+107</baselabel>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <baselabel type='qemu'>+107:+107</baselabel>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </secmodel>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </host>
Oct 01 13:53:01 compute-0 nova_compute[192698]: 
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <guest>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <os_type>hvm</os_type>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <arch name='i686'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <wordsize>32</wordsize>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <domain type='qemu'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <domain type='kvm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </arch>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <features>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <pae/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <nonpae/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <acpi default='on' toggle='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <apic default='on' toggle='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <cpuselection/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <deviceboot/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <disksnapshot default='on' toggle='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <externalSnapshot/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </features>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </guest>
Oct 01 13:53:01 compute-0 nova_compute[192698]: 
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <guest>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <os_type>hvm</os_type>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <arch name='x86_64'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <wordsize>64</wordsize>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <domain type='qemu'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <domain type='kvm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </arch>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <features>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <acpi default='on' toggle='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <apic default='on' toggle='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <cpuselection/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <deviceboot/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <disksnapshot default='on' toggle='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <externalSnapshot/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </features>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </guest>
Oct 01 13:53:01 compute-0 nova_compute[192698]: 
Oct 01 13:53:01 compute-0 nova_compute[192698]: </capabilities>
Oct 01 13:53:01 compute-0 nova_compute[192698]: 
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.292 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.329 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct 01 13:53:01 compute-0 nova_compute[192698]: <domainCapabilities>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <path>/usr/libexec/qemu-kvm</path>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <domain>kvm</domain>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <arch>i686</arch>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <vcpu max='240'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <iothreads supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <os supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <enum name='firmware'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <loader supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>rom</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>pflash</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='readonly'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>yes</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>no</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='secure'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>no</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </loader>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </os>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <cpu>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='host-passthrough' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='hostPassthroughMigratable'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>on</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>off</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='maximum' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='maximumMigratable'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>on</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>off</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='host-model' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <vendor>AMD</vendor>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='x2apic'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='tsc-deadline'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='hypervisor'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='tsc_adjust'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='spec-ctrl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='stibp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='arch-capabilities'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='cmp_legacy'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='overflow-recov'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='succor'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='ibrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='amd-ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='virt-ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='lbrv'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='tsc-scale'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='vmcb-clean'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='flushbyasid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='pause-filter'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='pfthreshold'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='svme-addr-chk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='rdctl-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='mds-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='pschange-mc-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='gds-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='rfds-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='disable' name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='custom' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cooperlake'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cooperlake-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cooperlake-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Dhyana-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Genoa'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amd-psfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='auto-ibrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='no-nested-data-bp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='null-sel-clr-base'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='stibp-always-on'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Genoa-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amd-psfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='auto-ibrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='no-nested-data-bp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='null-sel-clr-base'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='stibp-always-on'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Milan'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Milan-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Milan-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amd-psfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='no-nested-data-bp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='null-sel-clr-base'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='stibp-always-on'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='GraniteRapids'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='prefetchiti'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='GraniteRapids-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='prefetchiti'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='GraniteRapids-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10-128'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10-256'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10-512'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='prefetchiti'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v6'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v7'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='KnightsMill'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4fmaps'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4vnniw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512er'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512pf'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='KnightsMill-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4fmaps'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4vnniw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512er'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512pf'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G4-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tbm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G5-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tbm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SierraForest'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ne-convert'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cmpccxadd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SierraForest-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ne-convert'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cmpccxadd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='athlon'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='athlon-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='core2duo'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='core2duo-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='coreduo'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='coreduo-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='n270'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='n270-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='phenom'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='phenom-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </cpu>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <memoryBacking supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <enum name='sourceType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>file</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>anonymous</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>memfd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </memoryBacking>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <devices>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <disk supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='diskDevice'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>disk</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>cdrom</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>floppy</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>lun</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='bus'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>ide</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>fdc</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>scsi</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>usb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>sata</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-non-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </disk>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <graphics supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vnc</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>egl-headless</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>dbus</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </graphics>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <video supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='modelType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vga</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>cirrus</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>none</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>bochs</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>ramfb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </video>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <hostdev supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='mode'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>subsystem</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='startupPolicy'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>default</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>mandatory</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>requisite</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>optional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='subsysType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>usb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>pci</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>scsi</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='capsType'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='pciBackend'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </hostdev>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <rng supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-non-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendModel'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>random</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>egd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>builtin</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </rng>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <filesystem supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='driverType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>path</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>handle</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtiofs</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </filesystem>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <tpm supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>tpm-tis</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>tpm-crb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendModel'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>emulator</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>external</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendVersion'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>2.0</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </tpm>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <redirdev supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='bus'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>usb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </redirdev>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <channel supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>pty</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>unix</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </channel>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <crypto supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>qemu</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendModel'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>builtin</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </crypto>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <interface supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>default</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>passt</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </interface>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <panic supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>isa</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>hyperv</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </panic>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </devices>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <features>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <gic supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <vmcoreinfo supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <genid supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <backingStoreInput supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <backup supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <async-teardown supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <ps2 supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <sev supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <sgx supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <hyperv supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='features'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>relaxed</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vapic</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>spinlocks</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vpindex</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>runtime</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>synic</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>stimer</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>reset</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vendor_id</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>frequencies</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>reenlightenment</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>tlbflush</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>ipi</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>avic</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>emsr_bitmap</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>xmm_input</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </hyperv>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <launchSecurity supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </features>
Oct 01 13:53:01 compute-0 nova_compute[192698]: </domainCapabilities>
Oct 01 13:53:01 compute-0 nova_compute[192698]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.341 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct 01 13:53:01 compute-0 nova_compute[192698]: <domainCapabilities>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <path>/usr/libexec/qemu-kvm</path>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <domain>kvm</domain>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <arch>i686</arch>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <vcpu max='4096'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <iothreads supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <os supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <enum name='firmware'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <loader supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>rom</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>pflash</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='readonly'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>yes</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>no</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='secure'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>no</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </loader>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </os>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <cpu>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='host-passthrough' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='hostPassthroughMigratable'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>on</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>off</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='maximum' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='maximumMigratable'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>on</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>off</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='host-model' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <vendor>AMD</vendor>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='x2apic'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='tsc-deadline'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='hypervisor'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='tsc_adjust'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='spec-ctrl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='stibp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='arch-capabilities'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='cmp_legacy'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='overflow-recov'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='succor'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='ibrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='amd-ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='virt-ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='lbrv'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='tsc-scale'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='vmcb-clean'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='flushbyasid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='pause-filter'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='pfthreshold'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='svme-addr-chk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='rdctl-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='mds-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='pschange-mc-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='gds-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='rfds-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='disable' name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='custom' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cooperlake'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cooperlake-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cooperlake-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Dhyana-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Genoa'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amd-psfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='auto-ibrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='no-nested-data-bp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='null-sel-clr-base'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='stibp-always-on'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Genoa-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amd-psfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='auto-ibrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='no-nested-data-bp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='null-sel-clr-base'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='stibp-always-on'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Milan'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Milan-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Milan-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amd-psfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='no-nested-data-bp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='null-sel-clr-base'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='stibp-always-on'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='GraniteRapids'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='prefetchiti'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='GraniteRapids-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='prefetchiti'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='GraniteRapids-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10-128'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10-256'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10-512'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='prefetchiti'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v6'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v7'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='KnightsMill'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4fmaps'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4vnniw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512er'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512pf'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='KnightsMill-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4fmaps'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4vnniw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512er'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512pf'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G4-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tbm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G5-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tbm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SierraForest'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ne-convert'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cmpccxadd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SierraForest-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ne-convert'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cmpccxadd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='athlon'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='athlon-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='core2duo'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='core2duo-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='coreduo'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='coreduo-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='n270'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='n270-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='phenom'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='phenom-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </cpu>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <memoryBacking supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <enum name='sourceType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>file</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>anonymous</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>memfd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </memoryBacking>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <devices>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <disk supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='diskDevice'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>disk</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>cdrom</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>floppy</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>lun</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='bus'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>fdc</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>scsi</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>usb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>sata</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-non-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </disk>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <graphics supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vnc</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>egl-headless</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>dbus</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </graphics>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <video supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='modelType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vga</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>cirrus</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>none</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>bochs</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>ramfb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </video>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <hostdev supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='mode'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>subsystem</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='startupPolicy'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>default</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>mandatory</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>requisite</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>optional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='subsysType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>usb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>pci</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>scsi</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='capsType'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='pciBackend'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </hostdev>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <rng supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-non-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendModel'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>random</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>egd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>builtin</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </rng>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <filesystem supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='driverType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>path</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>handle</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtiofs</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </filesystem>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <tpm supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>tpm-tis</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>tpm-crb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendModel'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>emulator</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>external</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendVersion'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>2.0</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </tpm>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <redirdev supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='bus'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>usb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </redirdev>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <channel supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>pty</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>unix</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </channel>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <crypto supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>qemu</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendModel'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>builtin</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </crypto>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <interface supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>default</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>passt</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </interface>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <panic supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>isa</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>hyperv</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </panic>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </devices>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <features>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <gic supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <vmcoreinfo supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <genid supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <backingStoreInput supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <backup supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <async-teardown supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <ps2 supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <sev supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <sgx supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <hyperv supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='features'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>relaxed</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vapic</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>spinlocks</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vpindex</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>runtime</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>synic</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>stimer</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>reset</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vendor_id</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>frequencies</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>reenlightenment</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>tlbflush</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>ipi</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>avic</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>emsr_bitmap</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>xmm_input</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </hyperv>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <launchSecurity supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </features>
Oct 01 13:53:01 compute-0 nova_compute[192698]: </domainCapabilities>
Oct 01 13:53:01 compute-0 nova_compute[192698]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.389 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.396 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct 01 13:53:01 compute-0 nova_compute[192698]: <domainCapabilities>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <path>/usr/libexec/qemu-kvm</path>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <domain>kvm</domain>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <machine>pc-i440fx-rhel7.6.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <arch>x86_64</arch>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <vcpu max='240'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <iothreads supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <os supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <enum name='firmware'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <loader supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>rom</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>pflash</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='readonly'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>yes</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>no</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='secure'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>no</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </loader>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </os>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <cpu>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='host-passthrough' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='hostPassthroughMigratable'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>on</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>off</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='maximum' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='maximumMigratable'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>on</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>off</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='host-model' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <vendor>AMD</vendor>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='x2apic'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='tsc-deadline'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='hypervisor'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='tsc_adjust'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='spec-ctrl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='stibp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='arch-capabilities'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='cmp_legacy'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='overflow-recov'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='succor'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='ibrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='amd-ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='virt-ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='lbrv'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='tsc-scale'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='vmcb-clean'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='flushbyasid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='pause-filter'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='pfthreshold'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='svme-addr-chk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='rdctl-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='mds-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='pschange-mc-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='gds-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='rfds-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='disable' name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='custom' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cooperlake'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cooperlake-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cooperlake-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Dhyana-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Genoa'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amd-psfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='auto-ibrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='no-nested-data-bp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='null-sel-clr-base'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='stibp-always-on'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Genoa-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amd-psfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='auto-ibrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='no-nested-data-bp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='null-sel-clr-base'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='stibp-always-on'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Milan'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Milan-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Milan-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amd-psfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='no-nested-data-bp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='null-sel-clr-base'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='stibp-always-on'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='GraniteRapids'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='prefetchiti'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='GraniteRapids-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='prefetchiti'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='GraniteRapids-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10-128'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10-256'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10-512'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='prefetchiti'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v6'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v7'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='KnightsMill'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4fmaps'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4vnniw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512er'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512pf'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='KnightsMill-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4fmaps'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4vnniw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512er'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512pf'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G4-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tbm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G5-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tbm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SierraForest'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ne-convert'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cmpccxadd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SierraForest-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ne-convert'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cmpccxadd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='athlon'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='athlon-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='core2duo'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='core2duo-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='coreduo'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='coreduo-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='n270'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='n270-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='phenom'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='phenom-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </cpu>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <memoryBacking supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <enum name='sourceType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>file</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>anonymous</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>memfd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </memoryBacking>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <devices>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <disk supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='diskDevice'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>disk</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>cdrom</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>floppy</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>lun</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='bus'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>ide</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>fdc</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>scsi</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>usb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>sata</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-non-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </disk>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <graphics supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vnc</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>egl-headless</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>dbus</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </graphics>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <video supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='modelType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vga</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>cirrus</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>none</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>bochs</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>ramfb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </video>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <hostdev supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='mode'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>subsystem</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='startupPolicy'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>default</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>mandatory</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>requisite</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>optional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='subsysType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>usb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>pci</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>scsi</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='capsType'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='pciBackend'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </hostdev>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <rng supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-non-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendModel'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>random</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>egd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>builtin</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </rng>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <filesystem supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='driverType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>path</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>handle</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtiofs</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </filesystem>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <tpm supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>tpm-tis</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>tpm-crb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendModel'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>emulator</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>external</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendVersion'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>2.0</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </tpm>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <redirdev supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='bus'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>usb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </redirdev>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <channel supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>pty</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>unix</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </channel>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <crypto supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>qemu</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendModel'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>builtin</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </crypto>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <interface supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>default</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>passt</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </interface>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <panic supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>isa</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>hyperv</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </panic>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </devices>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <features>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <gic supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <vmcoreinfo supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <genid supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <backingStoreInput supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <backup supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <async-teardown supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <ps2 supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <sev supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <sgx supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <hyperv supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='features'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>relaxed</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vapic</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>spinlocks</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vpindex</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>runtime</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>synic</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>stimer</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>reset</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vendor_id</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>frequencies</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>reenlightenment</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>tlbflush</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>ipi</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>avic</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>emsr_bitmap</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>xmm_input</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </hyperv>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <launchSecurity supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </features>
Oct 01 13:53:01 compute-0 nova_compute[192698]: </domainCapabilities>
Oct 01 13:53:01 compute-0 nova_compute[192698]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.461 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct 01 13:53:01 compute-0 nova_compute[192698]: <domainCapabilities>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <path>/usr/libexec/qemu-kvm</path>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <domain>kvm</domain>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <machine>pc-q35-rhel9.6.0</machine>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <arch>x86_64</arch>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <vcpu max='4096'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <iothreads supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <os supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <enum name='firmware'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>efi</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <loader supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>rom</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>pflash</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='readonly'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>yes</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>no</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='secure'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>yes</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>no</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </loader>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </os>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <cpu>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='host-passthrough' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='hostPassthroughMigratable'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>on</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>off</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='maximum' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='maximumMigratable'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>on</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>off</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='host-model' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model fallback='forbid'>EPYC-Rome</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <vendor>AMD</vendor>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <maxphysaddr mode='passthrough' limit='40'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='x2apic'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='tsc-deadline'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='hypervisor'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='tsc_adjust'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='spec-ctrl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='stibp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='arch-capabilities'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='cmp_legacy'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='overflow-recov'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='succor'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='ibrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='amd-ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='virt-ssbd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='lbrv'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='tsc-scale'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='vmcb-clean'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='flushbyasid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='pause-filter'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='pfthreshold'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='svme-addr-chk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='lfence-always-serializing'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='rdctl-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='mds-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='pschange-mc-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='gds-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='require' name='rfds-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <feature policy='disable' name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <mode name='custom' supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Broadwell-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cascadelake-Server-v5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cooperlake'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cooperlake-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Cooperlake-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Denverton-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Dhyana-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Genoa'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amd-psfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='auto-ibrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='no-nested-data-bp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='null-sel-clr-base'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='stibp-always-on'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Genoa-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amd-psfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='auto-ibrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='no-nested-data-bp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='null-sel-clr-base'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='stibp-always-on'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Milan'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Milan-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Milan-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amd-psfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='no-nested-data-bp'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='null-sel-clr-base'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='stibp-always-on'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-Rome-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='EPYC-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='GraniteRapids'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='prefetchiti'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='GraniteRapids-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='prefetchiti'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='GraniteRapids-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10'/>
Oct 01 13:53:01 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10-128'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10-256'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx10-512'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='prefetchiti'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Haswell-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-noTSX'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v6'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Icelake-Server-v7'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='IvyBridge-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='KnightsMill'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4fmaps'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4vnniw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512er'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512pf'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='KnightsMill-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4fmaps'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-4vnniw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512er'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512pf'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G4-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tbm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Opteron_G5-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fma4'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tbm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xop'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SapphireRapids-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='amx-tile'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-bf16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-fp16'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512-vpopcntdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bitalg'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vbmi2'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrc'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fzrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='la57'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='taa-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='tsx-ldtrk'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xfd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SierraForest'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ne-convert'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cmpccxadd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='SierraForest-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ifma'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-ne-convert'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx-vnni-int8'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='bus-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cmpccxadd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fbsdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='fsrs'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ibrs-all'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mcdt-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pbrsb-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='psdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='sbdr-ssdp-no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='serialize'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vaes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='vpclmulqdq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Client-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='hle'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='rtm'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Skylake-Server-v5'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512bw'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512cd'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512dq'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512f'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='avx512vl'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='invpcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pcid'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='pku'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='mpx'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v2'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v3'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='core-capability'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='split-lock-detect'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='Snowridge-v4'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='cldemote'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='erms'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='gfni'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdir64b'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='movdiri'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='xsaves'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='athlon'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='athlon-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='core2duo'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='core2duo-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='coreduo'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='coreduo-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='n270'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='n270-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='ss'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='phenom'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <blockers model='phenom-v1'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnow'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <feature name='3dnowext'/>
Oct 01 13:53:01 compute-0 systemd[1]: Started libvirt nodedev daemon.
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </blockers>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </mode>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </cpu>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <memoryBacking supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <enum name='sourceType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>file</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>anonymous</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <value>memfd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </memoryBacking>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <devices>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <disk supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='diskDevice'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>disk</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>cdrom</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>floppy</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>lun</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='bus'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>fdc</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>scsi</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>usb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>sata</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-non-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </disk>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <graphics supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vnc</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>egl-headless</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>dbus</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </graphics>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <video supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='modelType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vga</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>cirrus</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>none</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>bochs</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>ramfb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </video>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <hostdev supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='mode'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>subsystem</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='startupPolicy'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>default</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>mandatory</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>requisite</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>optional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='subsysType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>usb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>pci</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>scsi</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='capsType'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='pciBackend'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </hostdev>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <rng supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtio-non-transitional</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendModel'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>random</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>egd</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>builtin</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </rng>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <filesystem supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='driverType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>path</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>handle</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>virtiofs</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </filesystem>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <tpm supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>tpm-tis</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>tpm-crb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendModel'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>emulator</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>external</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendVersion'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>2.0</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </tpm>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <redirdev supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='bus'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>usb</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </redirdev>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <channel supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>pty</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>unix</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </channel>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <crypto supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='type'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>qemu</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendModel'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>builtin</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </crypto>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <interface supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='backendType'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>default</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>passt</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </interface>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <panic supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='model'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>isa</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>hyperv</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </panic>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </devices>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   <features>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <gic supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <vmcoreinfo supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <genid supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <backingStoreInput supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <backup supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <async-teardown supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <ps2 supported='yes'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <sev supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <sgx supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <hyperv supported='yes'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       <enum name='features'>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>relaxed</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vapic</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>spinlocks</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vpindex</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>runtime</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>synic</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>stimer</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>reset</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>vendor_id</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>frequencies</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>reenlightenment</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>tlbflush</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>ipi</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>avic</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>emsr_bitmap</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:         <value>xmm_input</value>
Oct 01 13:53:01 compute-0 nova_compute[192698]:       </enum>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     </hyperv>
Oct 01 13:53:01 compute-0 nova_compute[192698]:     <launchSecurity supported='no'/>
Oct 01 13:53:01 compute-0 nova_compute[192698]:   </features>
Oct 01 13:53:01 compute-0 nova_compute[192698]: </domainCapabilities>
Oct 01 13:53:01 compute-0 nova_compute[192698]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.549 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.550 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.550 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.550 2 INFO nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Secure Boot support detected
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.558 2 INFO nova.virt.libvirt.driver [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.558 2 INFO nova.virt.libvirt.driver [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Oct 01 13:53:01 compute-0 anacron[1070]: Job `cron.weekly' started
Oct 01 13:53:01 compute-0 anacron[1070]: Job `cron.weekly' terminated
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.745 2 DEBUG nova.virt.libvirt.driver [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.770 2 WARNING nova.virt.libvirt.driver [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 01 13:53:01 compute-0 nova_compute[192698]: 2025-10-01 13:53:01.770 2 DEBUG nova.virt.libvirt.volume.mount [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct 01 13:53:02 compute-0 nova_compute[192698]: 2025-10-01 13:53:02.261 2 INFO nova.virt.node [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Determined node identity ee1e54f5-453b-4949-a499-9a192f03b8f0 from /var/lib/nova/compute_id
Oct 01 13:53:02 compute-0 nova_compute[192698]: 2025-10-01 13:53:02.776 2 WARNING nova.compute.manager [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Compute nodes ['ee1e54f5-453b-4949-a499-9a192f03b8f0'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Oct 01 13:53:03 compute-0 nova_compute[192698]: 2025-10-01 13:53:03.791 2 INFO nova.compute.manager [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Oct 01 13:53:04 compute-0 sshd-session[193019]: Accepted publickey for zuul from 192.168.122.30 port 47624 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 13:53:04 compute-0 systemd-logind[791]: New session 28 of user zuul.
Oct 01 13:53:04 compute-0 systemd[1]: Started Session 28 of User zuul.
Oct 01 13:53:04 compute-0 sshd-session[193019]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 13:53:04 compute-0 podman[193021]: 2025-10-01 13:53:04.299278432 +0000 UTC m=+0.086177984 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid)
Oct 01 13:53:04 compute-0 podman[193023]: 2025-10-01 13:53:04.314729984 +0000 UTC m=+0.100116356 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 13:53:04 compute-0 nova_compute[192698]: 2025-10-01 13:53:04.812 2 WARNING nova.compute.manager [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Oct 01 13:53:04 compute-0 nova_compute[192698]: 2025-10-01 13:53:04.813 2 DEBUG oslo_concurrency.lockutils [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:53:04 compute-0 nova_compute[192698]: 2025-10-01 13:53:04.813 2 DEBUG oslo_concurrency.lockutils [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:53:04 compute-0 nova_compute[192698]: 2025-10-01 13:53:04.814 2 DEBUG oslo_concurrency.lockutils [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:53:04 compute-0 nova_compute[192698]: 2025-10-01 13:53:04.814 2 DEBUG nova.compute.resource_tracker [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 13:53:05 compute-0 nova_compute[192698]: 2025-10-01 13:53:05.048 2 WARNING nova.virt.libvirt.driver [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 13:53:05 compute-0 nova_compute[192698]: 2025-10-01 13:53:05.050 2 DEBUG oslo_concurrency.processutils [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 13:53:05 compute-0 nova_compute[192698]: 2025-10-01 13:53:05.085 2 DEBUG oslo_concurrency.processutils [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 13:53:05 compute-0 nova_compute[192698]: 2025-10-01 13:53:05.085 2 DEBUG nova.compute.resource_tracker [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6189MB free_disk=73.5127182006836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 13:53:05 compute-0 nova_compute[192698]: 2025-10-01 13:53:05.086 2 DEBUG oslo_concurrency.lockutils [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:53:05 compute-0 nova_compute[192698]: 2025-10-01 13:53:05.086 2 DEBUG oslo_concurrency.lockutils [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:53:05 compute-0 python3.9[193211]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 01 13:53:05 compute-0 nova_compute[192698]: 2025-10-01 13:53:05.595 2 WARNING nova.compute.resource_tracker [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] No compute node record for compute-0.ctlplane.example.com:ee1e54f5-453b-4949-a499-9a192f03b8f0: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host ee1e54f5-453b-4949-a499-9a192f03b8f0 could not be found.
Oct 01 13:53:06 compute-0 nova_compute[192698]: 2025-10-01 13:53:06.103 2 INFO nova.compute.resource_tracker [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: ee1e54f5-453b-4949-a499-9a192f03b8f0
Oct 01 13:53:06 compute-0 sudo[193365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mixtyorlpkactzoimqudfzoqmlbkzgcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326786.07063-52-184226314747422/AnsiballZ_systemd_service.py'
Oct 01 13:53:06 compute-0 sudo[193365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:07 compute-0 python3.9[193367]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:53:07 compute-0 systemd[1]: Reloading.
Oct 01 13:53:07 compute-0 systemd-rc-local-generator[193396]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:53:07 compute-0 systemd-sysv-generator[193399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:53:07 compute-0 sudo[193365]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:07 compute-0 nova_compute[192698]: 2025-10-01 13:53:07.630 2 DEBUG nova.compute.resource_tracker [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 13:53:07 compute-0 nova_compute[192698]: 2025-10-01 13:53:07.631 2 DEBUG nova.compute.resource_tracker [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:53:05 up 52 min,  0 user,  load average: 1.05, 0.91, 0.69\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 13:53:07 compute-0 nova_compute[192698]: 2025-10-01 13:53:07.744 2 INFO nova.scheduler.client.report [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] [req-10d32043-8a93-4c4b-a17a-d081c4009f3d] Created resource provider record via placement API for resource provider with UUID ee1e54f5-453b-4949-a499-9a192f03b8f0 and name compute-0.ctlplane.example.com.
Oct 01 13:53:07 compute-0 nova_compute[192698]: 2025-10-01 13:53:07.805 2 DEBUG nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct 01 13:53:07 compute-0 nova_compute[192698]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Oct 01 13:53:07 compute-0 nova_compute[192698]: 2025-10-01 13:53:07.805 2 INFO nova.virt.libvirt.host [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] kernel doesn't support AMD SEV
Oct 01 13:53:07 compute-0 nova_compute[192698]: 2025-10-01 13:53:07.806 2 DEBUG nova.compute.provider_tree [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 13:53:07 compute-0 nova_compute[192698]: 2025-10-01 13:53:07.806 2 DEBUG nova.virt.libvirt.driver [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 13:53:08 compute-0 nova_compute[192698]: 2025-10-01 13:53:08.357 2 DEBUG nova.scheduler.client.report [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Updated inventory for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Oct 01 13:53:08 compute-0 nova_compute[192698]: 2025-10-01 13:53:08.358 2 DEBUG nova.compute.provider_tree [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Updating resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 01 13:53:08 compute-0 nova_compute[192698]: 2025-10-01 13:53:08.358 2 DEBUG nova.compute.provider_tree [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 13:53:08 compute-0 python3.9[193553]: ansible-ansible.builtin.service_facts Invoked
Oct 01 13:53:08 compute-0 nova_compute[192698]: 2025-10-01 13:53:08.471 2 DEBUG nova.compute.provider_tree [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Updating resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 01 13:53:08 compute-0 network[193570]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct 01 13:53:08 compute-0 network[193571]: 'network-scripts' will be removed from distribution in near future.
Oct 01 13:53:08 compute-0 network[193572]: It is advised to switch to 'NetworkManager' instead for network management.
Oct 01 13:53:08 compute-0 nova_compute[192698]: 2025-10-01 13:53:08.981 2 DEBUG nova.compute.resource_tracker [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 13:53:08 compute-0 nova_compute[192698]: 2025-10-01 13:53:08.981 2 DEBUG oslo_concurrency.lockutils [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.895s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:53:08 compute-0 nova_compute[192698]: 2025-10-01 13:53:08.982 2 DEBUG nova.service [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Oct 01 13:53:09 compute-0 nova_compute[192698]: 2025-10-01 13:53:09.095 2 DEBUG nova.service [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Oct 01 13:53:09 compute-0 nova_compute[192698]: 2025-10-01 13:53:09.096 2 DEBUG nova.servicegroup.drivers.db [None req-d41344e6-afd1-4552-bc93-1f06ca4fddfe - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Oct 01 13:53:13 compute-0 sudo[193847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooqefvcxkiedwxybfjpjbjstvbnhtwkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326793.4558682-90-198303195446568/AnsiballZ_systemd_service.py'
Oct 01 13:53:13 compute-0 sudo[193847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:53:14.204 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:53:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:53:14.206 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:53:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:53:14.206 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:53:14 compute-0 python3.9[193849]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:53:14 compute-0 sudo[193847]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:15 compute-0 sudo[194001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecouhohcjxbiwshqpxktmeenhggnnwnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326794.7134335-110-69760944850099/AnsiballZ_file.py'
Oct 01 13:53:15 compute-0 sudo[194001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:15 compute-0 python3.9[194003]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:15 compute-0 sudo[194001]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:15 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 01 13:53:15 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 01 13:53:16 compute-0 sudo[194154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgfhuylzilgkwgbfmngpxpjdgwbqkenn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326795.711378-126-158820005628507/AnsiballZ_file.py'
Oct 01 13:53:16 compute-0 sudo[194154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:16 compute-0 python3.9[194156]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:16 compute-0 sudo[194154]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:17 compute-0 sudo[194306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqorofkdgnekocjojdaarohvalqmzkmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326796.6046112-144-82238503226749/AnsiballZ_command.py'
Oct 01 13:53:17 compute-0 sudo[194306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:17 compute-0 python3.9[194308]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:53:17 compute-0 sudo[194306]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:18 compute-0 python3.9[194460]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 01 13:53:19 compute-0 sudo[194610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzsibidtogijdkihsehhbeqtiahehyof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326798.7268746-180-140456528132107/AnsiballZ_systemd_service.py'
Oct 01 13:53:19 compute-0 sudo[194610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:19 compute-0 python3.9[194612]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:53:19 compute-0 systemd[1]: Reloading.
Oct 01 13:53:19 compute-0 systemd-sysv-generator[194643]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:53:19 compute-0 systemd-rc-local-generator[194640]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:53:19 compute-0 sudo[194610]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:20 compute-0 sudo[194797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uerkmkoggawsbohenxwdyffuinnxifvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326799.982099-196-130470615759585/AnsiballZ_command.py'
Oct 01 13:53:20 compute-0 sudo[194797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:20 compute-0 python3.9[194799]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:53:20 compute-0 sudo[194797]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:21 compute-0 podman[194878]: 2025-10-01 13:53:21.225289631 +0000 UTC m=+0.122525730 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:53:21 compute-0 podman[194890]: 2025-10-01 13:53:21.271688109 +0000 UTC m=+0.166414932 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 01 13:53:21 compute-0 sudo[194996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzvrjamurpfhhjrnagoeaxqderjpywfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326800.9333332-214-38529448877495/AnsiballZ_file.py'
Oct 01 13:53:21 compute-0 sudo[194996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:21 compute-0 python3.9[194998]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:53:21 compute-0 sudo[194996]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:22 compute-0 python3.9[195148]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:53:23 compute-0 python3.9[195300]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:24 compute-0 python3.9[195421]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326802.7649038-246-93907122076014/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:53:24 compute-0 sudo[195571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usmtbiznyhmxjwrbinctvigxddycvuby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326804.259201-276-162217830849785/AnsiballZ_group.py'
Oct 01 13:53:24 compute-0 sudo[195571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:24 compute-0 python3.9[195573]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Oct 01 13:53:25 compute-0 sudo[195571]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:25 compute-0 sudo[195723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnqjafqjcrccqigpdnbolkzzgdxtmsjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326805.3572342-298-47446390728085/AnsiballZ_getent.py'
Oct 01 13:53:25 compute-0 sudo[195723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:26 compute-0 python3.9[195725]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Oct 01 13:53:26 compute-0 sudo[195723]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:26 compute-0 sudo[195876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkcqmvelgxpttkdrcihmnflscokniqox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326806.2825775-314-212533197375780/AnsiballZ_group.py'
Oct 01 13:53:26 compute-0 auditd[703]: Audit daemon rotating log files
Oct 01 13:53:26 compute-0 sudo[195876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:26 compute-0 python3.9[195878]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct 01 13:53:26 compute-0 groupadd[195879]: group added to /etc/group: name=ceilometer, GID=42405
Oct 01 13:53:26 compute-0 groupadd[195879]: group added to /etc/gshadow: name=ceilometer
Oct 01 13:53:26 compute-0 groupadd[195879]: new group: name=ceilometer, GID=42405
Oct 01 13:53:27 compute-0 sudo[195876]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:27 compute-0 sudo[196034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxltpvsbnpizouuhaqymhiynuczqtegm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326807.224504-330-101203997190151/AnsiballZ_user.py'
Oct 01 13:53:27 compute-0 sudo[196034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:28 compute-0 python3.9[196036]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct 01 13:53:28 compute-0 useradd[196038]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Oct 01 13:53:28 compute-0 useradd[196038]: add 'ceilometer' to group 'libvirt'
Oct 01 13:53:28 compute-0 useradd[196038]: add 'ceilometer' to shadow group 'libvirt'
Oct 01 13:53:28 compute-0 sudo[196034]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:29 compute-0 python3.9[196194]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:30 compute-0 python3.9[196315]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759326809.1156206-382-106134387103794/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:31 compute-0 python3.9[196465]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:31 compute-0 python3.9[196586]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759326810.5144076-382-101867152254417/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:32 compute-0 python3.9[196736]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:33 compute-0 python3.9[196857]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759326811.834543-382-257699988255222/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:33 compute-0 python3.9[197007]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:53:34 compute-0 python3.9[197159]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:53:35 compute-0 podman[197262]: 2025-10-01 13:53:35.174406349 +0000 UTC m=+0.083741161 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 01 13:53:35 compute-0 podman[197267]: 2025-10-01 13:53:35.181490228 +0000 UTC m=+0.089903506 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 01 13:53:35 compute-0 python3.9[197353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:36 compute-0 python3.9[197474]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326814.8158598-500-51471158388410/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:36 compute-0 python3.9[197624]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:37 compute-0 python3.9[197700]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:38 compute-0 python3.9[197850]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:38 compute-0 python3.9[197971]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326817.6381521-500-169336842762654/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=ffa17f3239e0dd55ed7347cc28623909421f3090 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:39 compute-0 python3.9[198121]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:40 compute-0 python3.9[198242]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326819.1842918-500-203953905893045/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:41 compute-0 python3.9[198392]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:42 compute-0 python3.9[198513]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326820.7106287-500-24569041043761/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:42 compute-0 python3.9[198663]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:43 compute-0 python3.9[198784]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326822.1958032-500-269382688201470/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:44 compute-0 python3.9[198934]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:44 compute-0 python3.9[199055]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326823.6115906-500-16562697143046/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:45 compute-0 python3.9[199205]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:46 compute-0 python3.9[199326]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326824.9527073-500-157623035247242/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:47 compute-0 python3.9[199476]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:47 compute-0 python3.9[199597]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326826.562262-500-52038996617170/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:48 compute-0 python3.9[199747]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:49 compute-0 python3.9[199868]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326828.1097116-500-256235311970785/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:50 compute-0 python3.9[200018]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:50 compute-0 python3.9[200139]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326829.600018-500-113294235475395/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:51 compute-0 podman[200263]: 2025-10-01 13:53:51.688067778 +0000 UTC m=+0.066379180 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, io.buildah.version=1.41.4)
Oct 01 13:53:51 compute-0 podman[200264]: 2025-10-01 13:53:51.782718409 +0000 UTC m=+0.148893747 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Oct 01 13:53:51 compute-0 python3.9[200317]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:52 compute-0 python3.9[200406]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:53 compute-0 python3.9[200556]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:53 compute-0 python3.9[200632]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:54 compute-0 python3.9[200782]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:53:55 compute-0 python3.9[200858]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:55 compute-0 sudo[201008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwsnaggwjuxljyccibzmanymmsxdmhpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326835.5170732-878-142546551802358/AnsiballZ_file.py'
Oct 01 13:53:55 compute-0 sudo[201008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:56 compute-0 python3.9[201010]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:56 compute-0 sudo[201008]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:56 compute-0 sudo[201160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-roezxvtkyslmpjtxcemokgnxqxnvaslh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326836.4211185-894-221600136474137/AnsiballZ_file.py'
Oct 01 13:53:56 compute-0 sudo[201160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:57 compute-0 python3.9[201162]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:53:57 compute-0 sudo[201160]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:57 compute-0 sudo[201312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxwsliyynyuvcthamuyfubndbckvvoso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326837.2519722-910-16259755845173/AnsiballZ_file.py'
Oct 01 13:53:57 compute-0 sudo[201312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:57 compute-0 python3.9[201314]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:53:57 compute-0 sudo[201312]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:58 compute-0 nova_compute[192698]: 2025-10-01 13:53:58.098 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:53:58 compute-0 sudo[201464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfohqkoidunhyjevugcucxrgjphzpibq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326838.0968645-926-164591264126468/AnsiballZ_systemd_service.py'
Oct 01 13:53:58 compute-0 sudo[201464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:53:58 compute-0 nova_compute[192698]: 2025-10-01 13:53:58.611 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:53:58 compute-0 python3.9[201466]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:53:58 compute-0 nova_compute[192698]: 2025-10-01 13:53:58.930 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:53:58 compute-0 nova_compute[192698]: 2025-10-01 13:53:58.930 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:53:58 compute-0 nova_compute[192698]: 2025-10-01 13:53:58.930 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:53:58 compute-0 nova_compute[192698]: 2025-10-01 13:53:58.931 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:53:58 compute-0 nova_compute[192698]: 2025-10-01 13:53:58.931 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:53:58 compute-0 nova_compute[192698]: 2025-10-01 13:53:58.931 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:53:58 compute-0 nova_compute[192698]: 2025-10-01 13:53:58.931 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:53:58 compute-0 nova_compute[192698]: 2025-10-01 13:53:58.931 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 13:53:58 compute-0 nova_compute[192698]: 2025-10-01 13:53:58.932 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:53:58 compute-0 systemd[1]: Reloading.
Oct 01 13:53:59 compute-0 systemd-sysv-generator[201496]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:53:59 compute-0 systemd-rc-local-generator[201493]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:53:59 compute-0 systemd[1]: Listening on Podman API Socket.
Oct 01 13:53:59 compute-0 sudo[201464]: pam_unix(sudo:session): session closed for user root
Oct 01 13:53:59 compute-0 nova_compute[192698]: 2025-10-01 13:53:59.448 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:53:59 compute-0 nova_compute[192698]: 2025-10-01 13:53:59.450 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:53:59 compute-0 nova_compute[192698]: 2025-10-01 13:53:59.451 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:53:59 compute-0 nova_compute[192698]: 2025-10-01 13:53:59.451 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 13:53:59 compute-0 nova_compute[192698]: 2025-10-01 13:53:59.652 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 13:53:59 compute-0 nova_compute[192698]: 2025-10-01 13:53:59.653 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 13:53:59 compute-0 nova_compute[192698]: 2025-10-01 13:53:59.703 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 13:53:59 compute-0 nova_compute[192698]: 2025-10-01 13:53:59.704 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6170MB free_disk=73.51251602172852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 13:53:59 compute-0 nova_compute[192698]: 2025-10-01 13:53:59.704 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:53:59 compute-0 nova_compute[192698]: 2025-10-01 13:53:59.704 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:54:00 compute-0 sudo[201655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxhyqbmkphgrbnravsfuwuqqyygkvkfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326839.6563652-944-63364422077140/AnsiballZ_stat.py'
Oct 01 13:54:00 compute-0 sudo[201655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:00 compute-0 python3.9[201657]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:54:00 compute-0 sudo[201655]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:00 compute-0 sudo[201778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fszgkyeslgfyuetowcziakoschvgzcoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326839.6563652-944-63364422077140/AnsiballZ_copy.py'
Oct 01 13:54:00 compute-0 sudo[201778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:00 compute-0 nova_compute[192698]: 2025-10-01 13:54:00.758 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 13:54:00 compute-0 nova_compute[192698]: 2025-10-01 13:54:00.758 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:53:59 up 53 min,  0 user,  load average: 0.92, 0.89, 0.70\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 13:54:00 compute-0 nova_compute[192698]: 2025-10-01 13:54:00.848 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 13:54:00 compute-0 python3.9[201780]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326839.6563652-944-63364422077140/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:54:01 compute-0 sudo[201778]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:01 compute-0 nova_compute[192698]: 2025-10-01 13:54:01.357 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 13:54:01 compute-0 nova_compute[192698]: 2025-10-01 13:54:01.872 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 13:54:01 compute-0 nova_compute[192698]: 2025-10-01 13:54:01.872 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.168s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:54:01 compute-0 sudo[201930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zilgdhlqjklkacdgohxtpltlahcnzptm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326841.422556-978-220294448449149/AnsiballZ_container_config_data.py'
Oct 01 13:54:01 compute-0 sudo[201930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:02 compute-0 python3.9[201932]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Oct 01 13:54:02 compute-0 sudo[201930]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:02 compute-0 sudo[202082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ripxqhfkflmnzxfpqgoeudbsoscgcqgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326842.418928-996-141458859278734/AnsiballZ_container_config_hash.py'
Oct 01 13:54:02 compute-0 sudo[202082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:03 compute-0 python3.9[202084]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 01 13:54:03 compute-0 sudo[202082]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:04 compute-0 sudo[202234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oksnxudgfuegvbqtniyaluzrgyaohnwh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759326843.514954-1016-50387489356676/AnsiballZ_edpm_container_manage.py'
Oct 01 13:54:04 compute-0 sudo[202234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:04 compute-0 python3[202236]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 01 13:54:05 compute-0 podman[202291]: 2025-10-01 13:54:05.582633286 +0000 UTC m=+0.124170959 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 01 13:54:05 compute-0 podman[202292]: 2025-10-01 13:54:05.842536689 +0000 UTC m=+0.369538305 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 01 13:54:06 compute-0 podman[202248]: 2025-10-01 13:54:06.123895214 +0000 UTC m=+1.633498165 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct 01 13:54:06 compute-0 podman[202389]: 2025-10-01 13:54:06.330002834 +0000 UTC m=+0.073852458 container create a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible)
Oct 01 13:54:06 compute-0 podman[202389]: 2025-10-01 13:54:06.298522416 +0000 UTC m=+0.042372050 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct 01 13:54:06 compute-0 python3[202236]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Oct 01 13:54:06 compute-0 sudo[202234]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:07 compute-0 sudo[202580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-doycmzsmtmpnbafwfidzklxwwxenjifd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326846.7368898-1032-173296988139154/AnsiballZ_stat.py'
Oct 01 13:54:07 compute-0 sudo[202580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:07 compute-0 unix_chkpwd[202583]: password check failed for user (root)
Oct 01 13:54:07 compute-0 sshd-session[202429]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 01 13:54:07 compute-0 python3.9[202582]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:54:07 compute-0 sudo[202580]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:08 compute-0 sudo[202735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghfhgmvaiafnhcmynfmyjebertknmrli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326847.7470431-1050-167818636590295/AnsiballZ_file.py'
Oct 01 13:54:08 compute-0 sudo[202735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:08 compute-0 python3.9[202737]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:54:08 compute-0 sudo[202735]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:09 compute-0 sudo[202886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhksbunrpoortcxgkmtuoxwrzylanvpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326848.4838963-1050-265809678047983/AnsiballZ_copy.py'
Oct 01 13:54:09 compute-0 sudo[202886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:09 compute-0 python3.9[202888]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759326848.4838963-1050-265809678047983/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:54:09 compute-0 sudo[202886]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:09 compute-0 sshd-session[202429]: Failed password for root from 80.94.93.176 port 58468 ssh2
Oct 01 13:54:09 compute-0 sudo[202962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwragcyhgytgkzkofxsgxmrzgmrclzbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326848.4838963-1050-265809678047983/AnsiballZ_systemd.py'
Oct 01 13:54:09 compute-0 sudo[202962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:10 compute-0 unix_chkpwd[202965]: password check failed for user (root)
Oct 01 13:54:10 compute-0 python3.9[202964]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:54:10 compute-0 systemd[1]: Reloading.
Oct 01 13:54:10 compute-0 systemd-sysv-generator[202994]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:54:10 compute-0 systemd-rc-local-generator[202989]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:54:10 compute-0 sudo[202962]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:10 compute-0 sudo[203075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpyksfbtkhtqbbmtjltwbwzridgkmkty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326848.4838963-1050-265809678047983/AnsiballZ_systemd.py'
Oct 01 13:54:10 compute-0 sudo[203075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:11 compute-0 python3.9[203077]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:54:11 compute-0 systemd[1]: Reloading.
Oct 01 13:54:11 compute-0 systemd-rc-local-generator[203108]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:54:11 compute-0 systemd-sysv-generator[203112]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:54:11 compute-0 systemd[1]: Starting podman_exporter container...
Oct 01 13:54:11 compute-0 systemd[1]: Started libcrun container.
Oct 01 13:54:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/070b304ab095524169447cd9f0efc9a2c0196d3f146f4a1f52709b4ea9d8d351/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 01 13:54:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/070b304ab095524169447cd9f0efc9a2c0196d3f146f4a1f52709b4ea9d8d351/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 01 13:54:11 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e.
Oct 01 13:54:11 compute-0 podman[203118]: 2025-10-01 13:54:11.875697782 +0000 UTC m=+0.190274219 container init a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 13:54:11 compute-0 podman_exporter[203133]: ts=2025-10-01T13:54:11.898Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct 01 13:54:11 compute-0 podman_exporter[203133]: ts=2025-10-01T13:54:11.898Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct 01 13:54:11 compute-0 podman_exporter[203133]: ts=2025-10-01T13:54:11.898Z caller=handler.go:94 level=info msg="enabled collectors"
Oct 01 13:54:11 compute-0 podman_exporter[203133]: ts=2025-10-01T13:54:11.898Z caller=handler.go:105 level=info collector=container
Oct 01 13:54:11 compute-0 podman[203118]: 2025-10-01 13:54:11.912579485 +0000 UTC m=+0.227155832 container start a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 13:54:11 compute-0 podman[203118]: podman_exporter
Oct 01 13:54:11 compute-0 systemd[1]: Starting Podman API Service...
Oct 01 13:54:11 compute-0 systemd[1]: Started podman_exporter container.
Oct 01 13:54:11 compute-0 systemd[1]: Started Podman API Service.
Oct 01 13:54:11 compute-0 sshd-session[202429]: Failed password for root from 80.94.93.176 port 58468 ssh2
Oct 01 13:54:11 compute-0 sudo[203075]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:11 compute-0 podman[203144]: time="2025-10-01T13:54:11Z" level=info msg="/usr/bin/podman filtering at log level info"
Oct 01 13:54:11 compute-0 podman[203144]: time="2025-10-01T13:54:11Z" level=info msg="Setting parallel job count to 25"
Oct 01 13:54:11 compute-0 podman[203144]: time="2025-10-01T13:54:11Z" level=info msg="Using sqlite as database backend"
Oct 01 13:54:11 compute-0 podman[203144]: time="2025-10-01T13:54:11Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Oct 01 13:54:11 compute-0 podman[203144]: time="2025-10-01T13:54:11Z" level=info msg="Using systemd socket activation to determine API endpoint"
Oct 01 13:54:11 compute-0 podman[203144]: time="2025-10-01T13:54:11Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Oct 01 13:54:12 compute-0 podman[203144]: @ - - [01/Oct/2025:13:54:12 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct 01 13:54:12 compute-0 podman[203144]: time="2025-10-01T13:54:12Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 13:54:12 compute-0 podman[203144]: @ - - [01/Oct/2025:13:54:12 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 16536 "" "Go-http-client/1.1"
Oct 01 13:54:12 compute-0 podman_exporter[203133]: ts=2025-10-01T13:54:12.041Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct 01 13:54:12 compute-0 podman_exporter[203133]: ts=2025-10-01T13:54:12.042Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct 01 13:54:12 compute-0 podman_exporter[203133]: ts=2025-10-01T13:54:12.043Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct 01 13:54:12 compute-0 podman[203142]: 2025-10-01 13:54:12.053657723 +0000 UTC m=+0.119484854 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 13:54:12 compute-0 systemd[1]: a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e-20a28bb2d8334b10.service: Main process exited, code=exited, status=1/FAILURE
Oct 01 13:54:12 compute-0 systemd[1]: a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e-20a28bb2d8334b10.service: Failed with result 'exit-code'.
Oct 01 13:54:12 compute-0 sudo[203333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feiafvjbhvefvdjhoxyfibhcgfgrvhwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326852.2031896-1098-280274911329172/AnsiballZ_systemd.py'
Oct 01 13:54:12 compute-0 sudo[203333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:12 compute-0 unix_chkpwd[203336]: password check failed for user (root)
Oct 01 13:54:12 compute-0 python3.9[203335]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:54:12 compute-0 systemd[1]: Stopping podman_exporter container...
Oct 01 13:54:13 compute-0 podman[203144]: @ - - [01/Oct/2025:13:54:12 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Oct 01 13:54:13 compute-0 systemd[1]: libpod-a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e.scope: Deactivated successfully.
Oct 01 13:54:13 compute-0 podman[203340]: 2025-10-01 13:54:13.063146874 +0000 UTC m=+0.050140877 container died a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 13:54:13 compute-0 systemd[1]: a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e-20a28bb2d8334b10.timer: Deactivated successfully.
Oct 01 13:54:13 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e.
Oct 01 13:54:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e-userdata-shm.mount: Deactivated successfully.
Oct 01 13:54:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-070b304ab095524169447cd9f0efc9a2c0196d3f146f4a1f52709b4ea9d8d351-merged.mount: Deactivated successfully.
Oct 01 13:54:13 compute-0 podman[203340]: 2025-10-01 13:54:13.249095347 +0000 UTC m=+0.236089310 container cleanup a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 13:54:13 compute-0 podman[203340]: podman_exporter
Oct 01 13:54:13 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 01 13:54:13 compute-0 podman[203367]: podman_exporter
Oct 01 13:54:13 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Oct 01 13:54:13 compute-0 systemd[1]: Stopped podman_exporter container.
Oct 01 13:54:13 compute-0 systemd[1]: Starting podman_exporter container...
Oct 01 13:54:13 compute-0 systemd[1]: Started libcrun container.
Oct 01 13:54:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/070b304ab095524169447cd9f0efc9a2c0196d3f146f4a1f52709b4ea9d8d351/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 01 13:54:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/070b304ab095524169447cd9f0efc9a2c0196d3f146f4a1f52709b4ea9d8d351/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 01 13:54:13 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e.
Oct 01 13:54:13 compute-0 podman[203380]: 2025-10-01 13:54:13.539667656 +0000 UTC m=+0.174259102 container init a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 13:54:13 compute-0 podman[203144]: @ - - [01/Oct/2025:13:54:13 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct 01 13:54:13 compute-0 podman_exporter[203396]: ts=2025-10-01T13:54:13.561Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct 01 13:54:13 compute-0 podman[203144]: time="2025-10-01T13:54:13Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 13:54:13 compute-0 podman_exporter[203396]: ts=2025-10-01T13:54:13.561Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct 01 13:54:13 compute-0 podman_exporter[203396]: ts=2025-10-01T13:54:13.562Z caller=handler.go:94 level=info msg="enabled collectors"
Oct 01 13:54:13 compute-0 podman_exporter[203396]: ts=2025-10-01T13:54:13.562Z caller=handler.go:105 level=info collector=container
Oct 01 13:54:13 compute-0 podman[203380]: 2025-10-01 13:54:13.583855633 +0000 UTC m=+0.218447109 container start a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 13:54:13 compute-0 podman[203380]: podman_exporter
Oct 01 13:54:13 compute-0 podman[203144]: @ - - [01/Oct/2025:13:54:13 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 16538 "" "Go-http-client/1.1"
Oct 01 13:54:13 compute-0 podman_exporter[203396]: ts=2025-10-01T13:54:13.594Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct 01 13:54:13 compute-0 podman_exporter[203396]: ts=2025-10-01T13:54:13.595Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct 01 13:54:13 compute-0 podman_exporter[203396]: ts=2025-10-01T13:54:13.596Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct 01 13:54:13 compute-0 systemd[1]: Started podman_exporter container.
Oct 01 13:54:13 compute-0 sudo[203333]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:13 compute-0 podman[203406]: 2025-10-01 13:54:13.694785448 +0000 UTC m=+0.091898779 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 13:54:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:54:14.211 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:54:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:54:14.211 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:54:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:54:14.211 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:54:15 compute-0 sshd-session[202429]: Failed password for root from 80.94.93.176 port 58468 ssh2
Oct 01 13:54:15 compute-0 sudo[203580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfxdchdsdabymtdtigirwbwtklafiuzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326854.8338895-1114-109838454257358/AnsiballZ_stat.py'
Oct 01 13:54:15 compute-0 sudo[203580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:15 compute-0 python3.9[203582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:54:15 compute-0 sudo[203580]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:15 compute-0 sshd-session[202429]: Received disconnect from 80.94.93.176 port 58468:11:  [preauth]
Oct 01 13:54:15 compute-0 sshd-session[202429]: Disconnected from authenticating user root 80.94.93.176 port 58468 [preauth]
Oct 01 13:54:15 compute-0 sshd-session[202429]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 01 13:54:15 compute-0 sudo[203705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfgzjchqqgdstwzaetkscoiwwiigvyko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326854.8338895-1114-109838454257358/AnsiballZ_copy.py'
Oct 01 13:54:15 compute-0 sudo[203705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:16 compute-0 python3.9[203707]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759326854.8338895-1114-109838454257358/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct 01 13:54:16 compute-0 sudo[203705]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:16 compute-0 unix_chkpwd[203744]: password check failed for user (root)
Oct 01 13:54:16 compute-0 sshd-session[203658]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 01 13:54:16 compute-0 sudo[203858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kniicyegtechepshsvrgjdnlfrgwptxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326856.5945911-1148-114201849823712/AnsiballZ_container_config_data.py'
Oct 01 13:54:16 compute-0 sudo[203858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:17 compute-0 python3.9[203860]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Oct 01 13:54:17 compute-0 sudo[203858]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:17 compute-0 sudo[204010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxqzpktidvbzvlizkrvpwdcjgowgnbsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326857.548063-1166-145684611141130/AnsiballZ_container_config_hash.py'
Oct 01 13:54:17 compute-0 sudo[204010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:18 compute-0 python3.9[204012]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct 01 13:54:18 compute-0 sshd-session[203658]: Failed password for root from 80.94.93.176 port 37316 ssh2
Oct 01 13:54:18 compute-0 sudo[204010]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:19 compute-0 sudo[204162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngajhyhbpwrkfwwaejhasrdpwabqqngw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759326858.6534617-1186-197807351787183/AnsiballZ_edpm_container_manage.py'
Oct 01 13:54:19 compute-0 sudo[204162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:19 compute-0 python3[204164]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct 01 13:54:19 compute-0 unix_chkpwd[204176]: password check failed for user (root)
Oct 01 13:54:21 compute-0 sshd-session[203658]: Failed password for root from 80.94.93.176 port 37316 ssh2
Oct 01 13:54:22 compute-0 unix_chkpwd[204289]: password check failed for user (root)
Oct 01 13:54:22 compute-0 podman[204238]: 2025-10-01 13:54:22.196445719 +0000 UTC m=+0.208062583 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 01 13:54:22 compute-0 podman[204239]: 2025-10-01 13:54:22.234073981 +0000 UTC m=+0.241281158 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 01 13:54:22 compute-0 podman[204178]: 2025-10-01 13:54:22.361873096 +0000 UTC m=+2.938233850 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct 01 13:54:22 compute-0 podman[204321]: 2025-10-01 13:54:22.608439744 +0000 UTC m=+0.080888026 container create e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=edpm, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., distribution-scope=public, version=9.6, com.redhat.component=ubi9-minimal-container, architecture=x86_64)
Oct 01 13:54:22 compute-0 podman[204321]: 2025-10-01 13:54:22.568292944 +0000 UTC m=+0.040741226 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct 01 13:54:22 compute-0 python3[204164]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct 01 13:54:22 compute-0 sudo[204162]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:23 compute-0 sudo[204507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuizltmsvlgnghfcnfzzotznsqxulcnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326863.0308316-1202-137487520270873/AnsiballZ_stat.py'
Oct 01 13:54:23 compute-0 sudo[204507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:23 compute-0 python3.9[204509]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:54:23 compute-0 sudo[204507]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:24 compute-0 sshd-session[203658]: Failed password for root from 80.94.93.176 port 37316 ssh2
Oct 01 13:54:24 compute-0 sudo[204661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpmrmgjvxnqasovovxguyrypvjefkuqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326863.9824429-1220-102239891529223/AnsiballZ_file.py'
Oct 01 13:54:24 compute-0 sudo[204661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:24 compute-0 python3.9[204663]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:54:24 compute-0 sudo[204661]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:24 compute-0 sshd-session[203658]: Received disconnect from 80.94.93.176 port 37316:11:  [preauth]
Oct 01 13:54:24 compute-0 sshd-session[203658]: Disconnected from authenticating user root 80.94.93.176 port 37316 [preauth]
Oct 01 13:54:24 compute-0 sshd-session[203658]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 01 13:54:25 compute-0 sudo[204814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzpbbqoemxgsrjkcxnislrnjzyrkqfhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326864.7165046-1220-174222982237327/AnsiballZ_copy.py'
Oct 01 13:54:25 compute-0 sudo[204814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:25 compute-0 python3.9[204816]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759326864.7165046-1220-174222982237327/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:54:25 compute-0 sudo[204814]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:25 compute-0 unix_chkpwd[204867]: password check failed for user (root)
Oct 01 13:54:25 compute-0 sshd-session[204762]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 01 13:54:25 compute-0 sudo[204891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvapvqswrenhoymvhjqnwqdunrqioqyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326864.7165046-1220-174222982237327/AnsiballZ_systemd.py'
Oct 01 13:54:25 compute-0 sudo[204891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:26 compute-0 python3.9[204893]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct 01 13:54:26 compute-0 systemd[1]: Reloading.
Oct 01 13:54:26 compute-0 systemd-rc-local-generator[204921]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:54:26 compute-0 systemd-sysv-generator[204925]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:54:26 compute-0 sudo[204891]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:26 compute-0 sudo[205001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwspwncujnoujszhazvtjykmeztyshxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326864.7165046-1220-174222982237327/AnsiballZ_systemd.py'
Oct 01 13:54:26 compute-0 sudo[205001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:27 compute-0 python3.9[205003]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct 01 13:54:27 compute-0 systemd[1]: Reloading.
Oct 01 13:54:27 compute-0 systemd-rc-local-generator[205028]: /etc/rc.d/rc.local is not marked executable, skipping.
Oct 01 13:54:27 compute-0 systemd-sysv-generator[205032]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct 01 13:54:27 compute-0 systemd[1]: Starting openstack_network_exporter container...
Oct 01 13:54:27 compute-0 systemd[1]: Started libcrun container.
Oct 01 13:54:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5197ae8e65e82dd2ddbb6cfeb9e091b08e83dc7402cdc219a04e9695887493b0/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 01 13:54:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5197ae8e65e82dd2ddbb6cfeb9e091b08e83dc7402cdc219a04e9695887493b0/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 01 13:54:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5197ae8e65e82dd2ddbb6cfeb9e091b08e83dc7402cdc219a04e9695887493b0/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 01 13:54:27 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c.
Oct 01 13:54:27 compute-0 podman[205044]: 2025-10-01 13:54:27.911072098 +0000 UTC m=+0.202422520 container init e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6)
Oct 01 13:54:27 compute-0 openstack_network_exporter[205060]: INFO    13:54:27 main.go:48: registering *bridge.Collector
Oct 01 13:54:27 compute-0 openstack_network_exporter[205060]: INFO    13:54:27 main.go:48: registering *coverage.Collector
Oct 01 13:54:27 compute-0 openstack_network_exporter[205060]: INFO    13:54:27 main.go:48: registering *datapath.Collector
Oct 01 13:54:27 compute-0 openstack_network_exporter[205060]: INFO    13:54:27 main.go:48: registering *iface.Collector
Oct 01 13:54:27 compute-0 openstack_network_exporter[205060]: INFO    13:54:27 main.go:48: registering *memory.Collector
Oct 01 13:54:27 compute-0 openstack_network_exporter[205060]: INFO    13:54:27 main.go:48: registering *ovnnorthd.Collector
Oct 01 13:54:27 compute-0 openstack_network_exporter[205060]: INFO    13:54:27 main.go:48: registering *ovn.Collector
Oct 01 13:54:27 compute-0 openstack_network_exporter[205060]: INFO    13:54:27 main.go:48: registering *ovsdbserver.Collector
Oct 01 13:54:27 compute-0 openstack_network_exporter[205060]: INFO    13:54:27 main.go:48: registering *pmd_perf.Collector
Oct 01 13:54:27 compute-0 openstack_network_exporter[205060]: INFO    13:54:27 main.go:48: registering *pmd_rxq.Collector
Oct 01 13:54:27 compute-0 openstack_network_exporter[205060]: INFO    13:54:27 main.go:48: registering *vswitch.Collector
Oct 01 13:54:27 compute-0 openstack_network_exporter[205060]: NOTICE  13:54:27 main.go:76: listening on https://:9105/metrics
Oct 01 13:54:27 compute-0 podman[205044]: 2025-10-01 13:54:27.960557905 +0000 UTC m=+0.251908317 container start e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc.)
Oct 01 13:54:27 compute-0 podman[205044]: openstack_network_exporter
Oct 01 13:54:27 compute-0 systemd[1]: Started openstack_network_exporter container.
Oct 01 13:54:28 compute-0 sudo[205001]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:28 compute-0 podman[205070]: 2025-10-01 13:54:28.103782784 +0000 UTC m=+0.121323544 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Oct 01 13:54:28 compute-0 sshd-session[204762]: Failed password for root from 80.94.93.176 port 41628 ssh2
Oct 01 13:54:28 compute-0 unix_chkpwd[205240]: password check failed for user (root)
Oct 01 13:54:28 compute-0 sudo[205243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldjfgmzadbbofdnrdzhulposrrpklpsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326868.2800255-1268-126698654436058/AnsiballZ_systemd.py'
Oct 01 13:54:28 compute-0 sudo[205243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:29 compute-0 python3.9[205245]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct 01 13:54:29 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Oct 01 13:54:29 compute-0 systemd[1]: libpod-e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c.scope: Deactivated successfully.
Oct 01 13:54:29 compute-0 podman[205249]: 2025-10-01 13:54:29.194741482 +0000 UTC m=+0.087811001 container died e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 01 13:54:29 compute-0 systemd[1]: e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c-75a6aa3ff3461c9b.timer: Deactivated successfully.
Oct 01 13:54:29 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c.
Oct 01 13:54:29 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c-userdata-shm.mount: Deactivated successfully.
Oct 01 13:54:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-5197ae8e65e82dd2ddbb6cfeb9e091b08e83dc7402cdc219a04e9695887493b0-merged.mount: Deactivated successfully.
Oct 01 13:54:29 compute-0 podman[205249]: 2025-10-01 13:54:29.938119268 +0000 UTC m=+0.831188817 container cleanup e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, managed_by=edpm_ansible, version=9.6)
Oct 01 13:54:29 compute-0 podman[205249]: openstack_network_exporter
Oct 01 13:54:29 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct 01 13:54:30 compute-0 sshd-session[204762]: Failed password for root from 80.94.93.176 port 41628 ssh2
Oct 01 13:54:30 compute-0 podman[205278]: openstack_network_exporter
Oct 01 13:54:30 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Oct 01 13:54:30 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Oct 01 13:54:30 compute-0 systemd[1]: Starting openstack_network_exporter container...
Oct 01 13:54:30 compute-0 systemd[1]: Started libcrun container.
Oct 01 13:54:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5197ae8e65e82dd2ddbb6cfeb9e091b08e83dc7402cdc219a04e9695887493b0/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct 01 13:54:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5197ae8e65e82dd2ddbb6cfeb9e091b08e83dc7402cdc219a04e9695887493b0/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct 01 13:54:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5197ae8e65e82dd2ddbb6cfeb9e091b08e83dc7402cdc219a04e9695887493b0/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct 01 13:54:30 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c.
Oct 01 13:54:30 compute-0 podman[205291]: 2025-10-01 13:54:30.243799639 +0000 UTC m=+0.175885599 container init e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.buildah.version=1.33.7, config_id=edpm, version=9.6, release=1755695350, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.openshift.expose-services=)
Oct 01 13:54:30 compute-0 openstack_network_exporter[205307]: INFO    13:54:30 main.go:48: registering *bridge.Collector
Oct 01 13:54:30 compute-0 openstack_network_exporter[205307]: INFO    13:54:30 main.go:48: registering *coverage.Collector
Oct 01 13:54:30 compute-0 openstack_network_exporter[205307]: INFO    13:54:30 main.go:48: registering *datapath.Collector
Oct 01 13:54:30 compute-0 openstack_network_exporter[205307]: INFO    13:54:30 main.go:48: registering *iface.Collector
Oct 01 13:54:30 compute-0 openstack_network_exporter[205307]: INFO    13:54:30 main.go:48: registering *memory.Collector
Oct 01 13:54:30 compute-0 openstack_network_exporter[205307]: INFO    13:54:30 main.go:48: registering *ovnnorthd.Collector
Oct 01 13:54:30 compute-0 openstack_network_exporter[205307]: INFO    13:54:30 main.go:48: registering *ovn.Collector
Oct 01 13:54:30 compute-0 openstack_network_exporter[205307]: INFO    13:54:30 main.go:48: registering *ovsdbserver.Collector
Oct 01 13:54:30 compute-0 openstack_network_exporter[205307]: INFO    13:54:30 main.go:48: registering *pmd_perf.Collector
Oct 01 13:54:30 compute-0 openstack_network_exporter[205307]: INFO    13:54:30 main.go:48: registering *pmd_rxq.Collector
Oct 01 13:54:30 compute-0 openstack_network_exporter[205307]: INFO    13:54:30 main.go:48: registering *vswitch.Collector
Oct 01 13:54:30 compute-0 openstack_network_exporter[205307]: NOTICE  13:54:30 main.go:76: listening on https://:9105/metrics
Oct 01 13:54:30 compute-0 podman[205291]: 2025-10-01 13:54:30.292716661 +0000 UTC m=+0.224802561 container start e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 01 13:54:30 compute-0 podman[205291]: openstack_network_exporter
Oct 01 13:54:30 compute-0 systemd[1]: Started openstack_network_exporter container.
Oct 01 13:54:30 compute-0 sudo[205243]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:30 compute-0 podman[205317]: 2025-10-01 13:54:30.420038077 +0000 UTC m=+0.111097576 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, architecture=x86_64, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 01 13:54:31 compute-0 sudo[205486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dksogenstmarmyiukgncbzesvsnenrzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326870.594085-1284-198410840066446/AnsiballZ_find.py'
Oct 01 13:54:31 compute-0 sudo[205486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:31 compute-0 python3.9[205488]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct 01 13:54:31 compute-0 sudo[205486]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:31 compute-0 unix_chkpwd[205513]: password check failed for user (root)
Oct 01 13:54:33 compute-0 sshd-session[204762]: Failed password for root from 80.94.93.176 port 41628 ssh2
Oct 01 13:54:34 compute-0 sshd-session[204762]: Received disconnect from 80.94.93.176 port 41628:11:  [preauth]
Oct 01 13:54:34 compute-0 sshd-session[204762]: Disconnected from authenticating user root 80.94.93.176 port 41628 [preauth]
Oct 01 13:54:34 compute-0 sshd-session[204762]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.176  user=root
Oct 01 13:54:36 compute-0 podman[205515]: 2025-10-01 13:54:36.175249622 +0000 UTC m=+0.086689811 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd)
Oct 01 13:54:36 compute-0 podman[205514]: 2025-10-01 13:54:36.195610806 +0000 UTC m=+0.112776881 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.4)
Oct 01 13:54:44 compute-0 podman[205553]: 2025-10-01 13:54:44.176760314 +0000 UTC m=+0.084084649 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 13:54:53 compute-0 podman[205577]: 2025-10-01 13:54:53.18973 +0000 UTC m=+0.098539274 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 01 13:54:53 compute-0 podman[205578]: 2025-10-01 13:54:53.247253695 +0000 UTC m=+0.148478552 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 01 13:54:53 compute-0 sudo[205745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xllhpzxdswigezfjyaeruprojkjhygde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326893.4841423-1501-108889376874698/AnsiballZ_podman_container_info.py'
Oct 01 13:54:53 compute-0 sudo[205745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:54 compute-0 python3.9[205747]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Oct 01 13:54:54 compute-0 sudo[205745]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:54 compute-0 sudo[205911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oycqjbhqtxmsekeuijhwljlsaxpgqtfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326894.521369-1509-226942431373808/AnsiballZ_podman_container_exec.py'
Oct 01 13:54:54 compute-0 sudo[205911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:55 compute-0 python3.9[205913]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 01 13:54:55 compute-0 systemd[1]: Started libpod-conmon-ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f.scope.
Oct 01 13:54:55 compute-0 podman[205914]: 2025-10-01 13:54:55.253735775 +0000 UTC m=+0.117119159 container exec ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4)
Oct 01 13:54:55 compute-0 podman[205914]: 2025-10-01 13:54:55.28991301 +0000 UTC m=+0.153296354 container exec_died ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4)
Oct 01 13:54:55 compute-0 systemd[1]: libpod-conmon-ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f.scope: Deactivated successfully.
Oct 01 13:54:55 compute-0 sudo[205911]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:55 compute-0 sudo[206096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvsiuhkuusdaovmkletkmmgjembgzplf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326895.5819957-1517-90395277105194/AnsiballZ_podman_container_exec.py'
Oct 01 13:54:55 compute-0 sudo[206096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:56 compute-0 python3.9[206098]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 01 13:54:56 compute-0 systemd[1]: Started libpod-conmon-ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f.scope.
Oct 01 13:54:56 compute-0 podman[206099]: 2025-10-01 13:54:56.33393541 +0000 UTC m=+0.112944135 container exec ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4)
Oct 01 13:54:56 compute-0 podman[206099]: 2025-10-01 13:54:56.366804245 +0000 UTC m=+0.145812960 container exec_died ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Oct 01 13:54:56 compute-0 systemd[1]: libpod-conmon-ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f.scope: Deactivated successfully.
Oct 01 13:54:56 compute-0 sudo[206096]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:57 compute-0 sudo[206282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoieuoscfcdlaxepwlefuengiggqvrdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326896.7194767-1525-274431384716800/AnsiballZ_file.py'
Oct 01 13:54:57 compute-0 sudo[206282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:57 compute-0 python3.9[206284]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:54:57 compute-0 sudo[206282]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:57 compute-0 sudo[206434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvgoprxsaqapkdrrwxzjruaomejhkdty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326897.6411273-1534-12574664856136/AnsiballZ_podman_container_info.py'
Oct 01 13:54:57 compute-0 sudo[206434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:58 compute-0 python3.9[206436]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Oct 01 13:54:58 compute-0 sudo[206434]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:58 compute-0 sudo[206599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beaxjmytlonnrponxyalrspgfrankszl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326898.5713346-1542-247537028668657/AnsiballZ_podman_container_exec.py'
Oct 01 13:54:58 compute-0 sudo[206599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:54:59 compute-0 python3.9[206601]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 01 13:54:59 compute-0 systemd[1]: Started libpod-conmon-3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3.scope.
Oct 01 13:54:59 compute-0 podman[206602]: 2025-10-01 13:54:59.272494402 +0000 UTC m=+0.109503262 container exec 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_managed=true)
Oct 01 13:54:59 compute-0 podman[206602]: 2025-10-01 13:54:59.307808003 +0000 UTC m=+0.144816863 container exec_died 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 01 13:54:59 compute-0 systemd[1]: libpod-conmon-3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3.scope: Deactivated successfully.
Oct 01 13:54:59 compute-0 sudo[206599]: pam_unix(sudo:session): session closed for user root
Oct 01 13:54:59 compute-0 sudo[206784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwlrsdrmjkkqkamzhqsjeiqznjtnwmbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326899.5020227-1550-156657312922767/AnsiballZ_podman_container_exec.py'
Oct 01 13:54:59 compute-0 sudo[206784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:00 compute-0 python3.9[206786]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 01 13:55:00 compute-0 systemd[1]: Started libpod-conmon-3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3.scope.
Oct 01 13:55:00 compute-0 podman[206787]: 2025-10-01 13:55:00.163712801 +0000 UTC m=+0.104734442 container exec 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, container_name=ovn_metadata_agent)
Oct 01 13:55:00 compute-0 podman[206787]: 2025-10-01 13:55:00.195142117 +0000 UTC m=+0.136163788 container exec_died 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 01 13:55:00 compute-0 systemd[1]: libpod-conmon-3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3.scope: Deactivated successfully.
Oct 01 13:55:00 compute-0 sudo[206784]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:00 compute-0 sudo[206984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upfyxqkruubdvpvalzmffpbxaoumqkxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326900.4528944-1558-87929601152806/AnsiballZ_file.py'
Oct 01 13:55:00 compute-0 sudo[206984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:00 compute-0 podman[206941]: 2025-10-01 13:55:00.867804458 +0000 UTC m=+0.082074495 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9-minimal, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 01 13:55:01 compute-0 python3.9[206990]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:01 compute-0 sudo[206984]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:01 compute-0 sudo[207140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akmtwltswbixyymnqgrrcguyfmfxibqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326901.3322766-1567-103140005281281/AnsiballZ_podman_container_info.py'
Oct 01 13:55:01 compute-0 sudo[207140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:01 compute-0 nova_compute[192698]: 2025-10-01 13:55:01.856 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:55:01 compute-0 nova_compute[192698]: 2025-10-01 13:55:01.857 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:55:01 compute-0 python3.9[207142]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Oct 01 13:55:02 compute-0 sudo[207140]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:02 compute-0 nova_compute[192698]: 2025-10-01 13:55:02.371 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:55:02 compute-0 nova_compute[192698]: 2025-10-01 13:55:02.373 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:55:02 compute-0 nova_compute[192698]: 2025-10-01 13:55:02.373 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:55:02 compute-0 nova_compute[192698]: 2025-10-01 13:55:02.374 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:55:02 compute-0 nova_compute[192698]: 2025-10-01 13:55:02.374 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:55:02 compute-0 nova_compute[192698]: 2025-10-01 13:55:02.375 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:55:02 compute-0 nova_compute[192698]: 2025-10-01 13:55:02.375 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 13:55:02 compute-0 nova_compute[192698]: 2025-10-01 13:55:02.375 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:55:02 compute-0 sudo[207305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbaqnnsmhwjgmisxvwuqyjhvnzrberso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326902.2493007-1575-207545525635663/AnsiballZ_podman_container_exec.py'
Oct 01 13:55:02 compute-0 sudo[207305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:02 compute-0 python3.9[207307]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 01 13:55:02 compute-0 nova_compute[192698]: 2025-10-01 13:55:02.892 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:55:02 compute-0 nova_compute[192698]: 2025-10-01 13:55:02.893 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:55:02 compute-0 nova_compute[192698]: 2025-10-01 13:55:02.893 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:55:02 compute-0 nova_compute[192698]: 2025-10-01 13:55:02.893 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 13:55:03 compute-0 nova_compute[192698]: 2025-10-01 13:55:03.121 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 13:55:03 compute-0 nova_compute[192698]: 2025-10-01 13:55:03.126 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 13:55:03 compute-0 systemd[1]: Started libpod-conmon-393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9.scope.
Oct 01 13:55:03 compute-0 podman[207308]: 2025-10-01 13:55:03.162803932 +0000 UTC m=+0.320220078 container exec 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 01 13:55:03 compute-0 nova_compute[192698]: 2025-10-01 13:55:03.165 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 13:55:03 compute-0 nova_compute[192698]: 2025-10-01 13:55:03.166 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6054MB free_disk=73.34341049194336GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 13:55:03 compute-0 nova_compute[192698]: 2025-10-01 13:55:03.167 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:55:03 compute-0 nova_compute[192698]: 2025-10-01 13:55:03.168 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:55:03 compute-0 podman[207329]: 2025-10-01 13:55:03.266666469 +0000 UTC m=+0.071641881 container exec_died 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 01 13:55:03 compute-0 podman[207308]: 2025-10-01 13:55:03.297406926 +0000 UTC m=+0.454823052 container exec_died 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 01 13:55:03 compute-0 systemd[1]: libpod-conmon-393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9.scope: Deactivated successfully.
Oct 01 13:55:03 compute-0 sudo[207305]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:04 compute-0 sudo[207492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdbitwaxjybtvonigmguaywovpdbdztb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326903.6993535-1583-254782107141923/AnsiballZ_podman_container_exec.py'
Oct 01 13:55:04 compute-0 sudo[207492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:04 compute-0 python3.9[207494]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 01 13:55:04 compute-0 nova_compute[192698]: 2025-10-01 13:55:04.278 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 13:55:04 compute-0 nova_compute[192698]: 2025-10-01 13:55:04.278 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:55:03 up 54 min,  0 user,  load average: 0.81, 0.87, 0.71\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 13:55:04 compute-0 nova_compute[192698]: 2025-10-01 13:55:04.302 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 13:55:04 compute-0 systemd[1]: Started libpod-conmon-393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9.scope.
Oct 01 13:55:04 compute-0 podman[207495]: 2025-10-01 13:55:04.360404692 +0000 UTC m=+0.090123565 container exec 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 01 13:55:04 compute-0 podman[207495]: 2025-10-01 13:55:04.399871186 +0000 UTC m=+0.129589979 container exec_died 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 13:55:04 compute-0 systemd[1]: libpod-conmon-393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9.scope: Deactivated successfully.
Oct 01 13:55:04 compute-0 sudo[207492]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:04 compute-0 nova_compute[192698]: 2025-10-01 13:55:04.809 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 13:55:05 compute-0 sudo[207676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrwqxejqrlmfwmoznmnofaungytssffz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326904.664726-1591-23650849731975/AnsiballZ_file.py'
Oct 01 13:55:05 compute-0 sudo[207676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:05 compute-0 python3.9[207678]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:05 compute-0 sudo[207676]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:05 compute-0 nova_compute[192698]: 2025-10-01 13:55:05.325 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 13:55:05 compute-0 nova_compute[192698]: 2025-10-01 13:55:05.325 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.158s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:55:05 compute-0 sudo[207828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlsbwjxnmspsngeeqfugtleqwsrhikmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326905.4717886-1600-124791331250694/AnsiballZ_podman_container_info.py'
Oct 01 13:55:05 compute-0 sudo[207828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:06 compute-0 python3.9[207830]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Oct 01 13:55:06 compute-0 sudo[207828]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:06 compute-0 podman[207968]: 2025-10-01 13:55:06.857496057 +0000 UTC m=+0.085711524 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:55:06 compute-0 podman[207967]: 2025-10-01 13:55:06.858627398 +0000 UTC m=+0.087447332 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:55:06 compute-0 sudo[208024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvsdwzljjeogwfexqogqlcxrtiyosrwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326906.4075518-1608-41947855974785/AnsiballZ_podman_container_exec.py'
Oct 01 13:55:06 compute-0 sudo[208024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:07 compute-0 python3.9[208032]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 01 13:55:07 compute-0 systemd[1]: Started libpod-conmon-d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0.scope.
Oct 01 13:55:07 compute-0 podman[208033]: 2025-10-01 13:55:07.171629397 +0000 UTC m=+0.081587441 container exec d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Oct 01 13:55:07 compute-0 podman[208033]: 2025-10-01 13:55:07.208689326 +0000 UTC m=+0.118647340 container exec_died d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 01 13:55:07 compute-0 systemd[1]: libpod-conmon-d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0.scope: Deactivated successfully.
Oct 01 13:55:07 compute-0 sudo[208024]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:07 compute-0 sudo[208212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nheocpgnavbflplknyjfnesxhecybcvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326907.4507618-1616-23695364355341/AnsiballZ_podman_container_exec.py'
Oct 01 13:55:07 compute-0 sudo[208212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:07 compute-0 python3.9[208214]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 01 13:55:08 compute-0 systemd[1]: Started libpod-conmon-d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0.scope.
Oct 01 13:55:08 compute-0 podman[208215]: 2025-10-01 13:55:08.118625636 +0000 UTC m=+0.109508502 container exec d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, tcib_build_tag=watcher_latest, container_name=multipathd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 01 13:55:08 compute-0 podman[208215]: 2025-10-01 13:55:08.15588025 +0000 UTC m=+0.146763116 container exec_died d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 01 13:55:08 compute-0 systemd[1]: libpod-conmon-d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0.scope: Deactivated successfully.
Oct 01 13:55:08 compute-0 sudo[208212]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:08 compute-0 sudo[208395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsqqgsbwyfagsgockcigsjiqhbebcmqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326908.434059-1624-86289353072518/AnsiballZ_file.py'
Oct 01 13:55:08 compute-0 sudo[208395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:09 compute-0 python3.9[208397]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:09 compute-0 sudo[208395]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:09 compute-0 sudo[208547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkdxqcwkrbhhcmmizuiscqampmyqgzrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326909.2881637-1633-257032873938303/AnsiballZ_podman_container_info.py'
Oct 01 13:55:09 compute-0 sudo[208547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:09 compute-0 python3.9[208549]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Oct 01 13:55:09 compute-0 sudo[208547]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:10 compute-0 sudo[208713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-paoipgodwnwddziklvsenkjshycruhve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326910.1680615-1641-54549041176589/AnsiballZ_podman_container_exec.py'
Oct 01 13:55:10 compute-0 sudo[208713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:10 compute-0 python3.9[208715]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 01 13:55:10 compute-0 systemd[1]: Started libpod-conmon-a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e.scope.
Oct 01 13:55:10 compute-0 podman[208716]: 2025-10-01 13:55:10.955922661 +0000 UTC m=+0.112491623 container exec a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 13:55:10 compute-0 podman[208716]: 2025-10-01 13:55:10.994660336 +0000 UTC m=+0.151229198 container exec_died a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 13:55:11 compute-0 systemd[1]: libpod-conmon-a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e.scope: Deactivated successfully.
Oct 01 13:55:11 compute-0 sudo[208713]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:11 compute-0 sudo[208895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtxnureufwangxdzbmveaombwjaulxzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326911.3319578-1649-100660850327702/AnsiballZ_podman_container_exec.py'
Oct 01 13:55:11 compute-0 sudo[208895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:11 compute-0 python3.9[208897]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 01 13:55:12 compute-0 systemd[1]: Started libpod-conmon-a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e.scope.
Oct 01 13:55:12 compute-0 podman[208898]: 2025-10-01 13:55:12.041920254 +0000 UTC m=+0.122578348 container exec a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 13:55:12 compute-0 podman[208918]: 2025-10-01 13:55:12.116587017 +0000 UTC m=+0.062207855 container exec_died a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 13:55:12 compute-0 podman[208898]: 2025-10-01 13:55:12.17143932 +0000 UTC m=+0.252097374 container exec_died a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 13:55:12 compute-0 systemd[1]: libpod-conmon-a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e.scope: Deactivated successfully.
Oct 01 13:55:12 compute-0 sudo[208895]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:12 compute-0 sudo[209080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjjhtuwoqwtwcgrkeddgvcoidimvvtzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326912.4769766-1657-80742481233085/AnsiballZ_file.py'
Oct 01 13:55:12 compute-0 sudo[209080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:13 compute-0 python3.9[209082]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:13 compute-0 sudo[209080]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:13 compute-0 sudo[209232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrelxiqkuaolqldefrrkintwuzrnkqar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326913.3418963-1666-42123493659158/AnsiballZ_podman_container_info.py'
Oct 01 13:55:13 compute-0 sudo[209232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:13 compute-0 python3.9[209234]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Oct 01 13:55:14 compute-0 sudo[209232]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:55:14.213 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:55:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:55:14.215 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:55:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:55:14.215 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:55:14 compute-0 sudo[209410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-torsojfzkxovqfuifzwmnrpmfsrilgzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326914.3474214-1674-240224223173584/AnsiballZ_podman_container_exec.py'
Oct 01 13:55:14 compute-0 sudo[209410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:14 compute-0 podman[209372]: 2025-10-01 13:55:14.742427626 +0000 UTC m=+0.094435712 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 13:55:14 compute-0 python3.9[209424]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 01 13:55:15 compute-0 systemd[1]: Started libpod-conmon-e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c.scope.
Oct 01 13:55:15 compute-0 podman[209426]: 2025-10-01 13:55:15.109629621 +0000 UTC m=+0.125274831 container exec e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41)
Oct 01 13:55:15 compute-0 podman[209446]: 2025-10-01 13:55:15.221549908 +0000 UTC m=+0.090720110 container exec_died e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, version=9.6, architecture=x86_64, 
io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Oct 01 13:55:15 compute-0 podman[209426]: 2025-10-01 13:55:15.278591311 +0000 UTC m=+0.294236501 container exec_died e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Oct 01 13:55:15 compute-0 systemd[1]: libpod-conmon-e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c.scope: Deactivated successfully.
Oct 01 13:55:15 compute-0 sudo[209410]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:16 compute-0 sudo[209609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qemfpngwrcgnmmlqbaulgfajwsijpfit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326915.7865644-1682-182665763525723/AnsiballZ_podman_container_exec.py'
Oct 01 13:55:16 compute-0 sudo[209609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:16 compute-0 python3.9[209611]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct 01 13:55:16 compute-0 systemd[1]: Started libpod-conmon-e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c.scope.
Oct 01 13:55:16 compute-0 podman[209612]: 2025-10-01 13:55:16.637677637 +0000 UTC m=+0.170242735 container exec e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=edpm, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Oct 01 13:55:16 compute-0 podman[209612]: 2025-10-01 13:55:16.669863273 +0000 UTC m=+0.202428421 container exec_died e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Oct 01 13:55:16 compute-0 systemd[1]: libpod-conmon-e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c.scope: Deactivated successfully.
Oct 01 13:55:16 compute-0 sudo[209609]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:17 compute-0 sudo[209794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suguflpwettaxzwwxrvpnucltbdzmkyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326916.959127-1690-140004677780526/AnsiballZ_file.py'
Oct 01 13:55:17 compute-0 sudo[209794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:17 compute-0 python3.9[209796]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:17 compute-0 sudo[209794]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:18 compute-0 sudo[209946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tilvsnlwqlmfeaflsjrdehhadvwkdcqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326917.9054005-1700-174786778045184/AnsiballZ_file.py'
Oct 01 13:55:18 compute-0 sudo[209946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:18 compute-0 python3.9[209948]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:18 compute-0 sudo[209946]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:19 compute-0 sudo[210098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukgmpujpworoimcvnlgohegykkhutbqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326918.7393968-1716-175194109483175/AnsiballZ_stat.py'
Oct 01 13:55:19 compute-0 sudo[210098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:19 compute-0 python3.9[210100]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:55:19 compute-0 sudo[210098]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:19 compute-0 sudo[210221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdtbuexkrhsxjcennkgdhshvxlsnfwth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326918.7393968-1716-175194109483175/AnsiballZ_copy.py'
Oct 01 13:55:19 compute-0 sudo[210221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:20 compute-0 python3.9[210223]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759326918.7393968-1716-175194109483175/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:20 compute-0 sudo[210221]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:20 compute-0 sudo[210373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxafiraorwfuyojjyycwelhvqktlfqij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326920.5116067-1748-56607123654500/AnsiballZ_file.py'
Oct 01 13:55:20 compute-0 sudo[210373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:21 compute-0 python3.9[210375]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:21 compute-0 sudo[210373]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:21 compute-0 sudo[210525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpgvssqcdeivvxtjhiqzitjqgfmuatkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326921.373276-1764-50383987247710/AnsiballZ_stat.py'
Oct 01 13:55:21 compute-0 sudo[210525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:22 compute-0 python3.9[210527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:55:22 compute-0 sudo[210525]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:22 compute-0 sudo[210603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewbaabenxtamlphbhbisswsiasxsepwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326921.373276-1764-50383987247710/AnsiballZ_file.py'
Oct 01 13:55:22 compute-0 sudo[210603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:22 compute-0 python3.9[210605]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:22 compute-0 sudo[210603]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:23 compute-0 sudo[210755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgqdrfsexbtozifaezpnyujayibfgagt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326922.8255126-1788-198484382279474/AnsiballZ_stat.py'
Oct 01 13:55:23 compute-0 sudo[210755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:23 compute-0 podman[210757]: 2025-10-01 13:55:23.376088017 +0000 UTC m=+0.108138225 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 01 13:55:23 compute-0 python3.9[210758]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:55:23 compute-0 sudo[210755]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:23 compute-0 podman[210778]: 2025-10-01 13:55:23.564849765 +0000 UTC m=+0.145382939 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 01 13:55:23 compute-0 sudo[210880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elfllgfrdtxyabkrgwtsexbfbybaligm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326922.8255126-1788-198484382279474/AnsiballZ_file.py'
Oct 01 13:55:23 compute-0 sudo[210880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:24 compute-0 python3.9[210882]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.wlgtjtyt recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:24 compute-0 sudo[210880]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:24 compute-0 sudo[211032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txpxevjrteehdthuzcmikqpcaruvzkym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326924.35299-1812-87841378138086/AnsiballZ_stat.py'
Oct 01 13:55:24 compute-0 sudo[211032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:24 compute-0 python3.9[211034]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:55:25 compute-0 sudo[211032]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:25 compute-0 sudo[211110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogfnnxcizaxmdtnhlyetvvzndgqoqxtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326924.35299-1812-87841378138086/AnsiballZ_file.py'
Oct 01 13:55:25 compute-0 sudo[211110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:25 compute-0 python3.9[211112]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:25 compute-0 sudo[211110]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:26 compute-0 sudo[211262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sikrcvudftzseashtnztkztbmrpcjbbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326925.898772-1838-12747520851330/AnsiballZ_command.py'
Oct 01 13:55:26 compute-0 sudo[211262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:26 compute-0 python3.9[211264]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:55:26 compute-0 sudo[211262]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:27 compute-0 sudo[211415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttzflqguagrriverwftpaninidyyptuk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759326926.7827094-1854-232100132216821/AnsiballZ_edpm_nftables_from_files.py'
Oct 01 13:55:27 compute-0 sudo[211415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:27 compute-0 python3[211417]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct 01 13:55:27 compute-0 sudo[211415]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:28 compute-0 sudo[211567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlhrkxtgijbblezxqowadizsvvkqxyht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326927.76031-1870-157276342639848/AnsiballZ_stat.py'
Oct 01 13:55:28 compute-0 sudo[211567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:28 compute-0 python3.9[211569]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:55:28 compute-0 sudo[211567]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:28 compute-0 sudo[211645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubpysiwgtrqbqyerbwjvhggkrwsqwzeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326927.76031-1870-157276342639848/AnsiballZ_file.py'
Oct 01 13:55:28 compute-0 sudo[211645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:29 compute-0 python3.9[211647]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:29 compute-0 sudo[211645]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:29 compute-0 sudo[211797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uluudgokbivzpcvjubrhjsuotqgzwoln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326929.2825825-1894-5387685355055/AnsiballZ_stat.py'
Oct 01 13:55:29 compute-0 sudo[211797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:29 compute-0 python3.9[211799]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:55:29 compute-0 sudo[211797]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:30 compute-0 sudo[211875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyzmmjynqiahqbsuyeinorskecbtjetu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326929.2825825-1894-5387685355055/AnsiballZ_file.py'
Oct 01 13:55:30 compute-0 sudo[211875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:30 compute-0 python3.9[211877]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:30 compute-0 sudo[211875]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:31 compute-0 sudo[212040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tivgvqbnomuatyyadknhycnaswhmbcdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326930.6411781-1918-224502597402032/AnsiballZ_stat.py'
Oct 01 13:55:31 compute-0 sudo[212040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:31 compute-0 podman[212001]: 2025-10-01 13:55:31.1104604 +0000 UTC m=+0.093382319 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350)
Oct 01 13:55:31 compute-0 python3.9[212048]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:55:31 compute-0 sudo[212040]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:31 compute-0 sudo[212127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anxgoufgpqfedsimuqrpkwfxbccpmrot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326930.6411781-1918-224502597402032/AnsiballZ_file.py'
Oct 01 13:55:31 compute-0 sudo[212127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:31 compute-0 python3.9[212129]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:31 compute-0 sudo[212127]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:32 compute-0 sudo[212279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfibwieyospzrdydvptjjjqlsxyqhwuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326932.1588895-1942-232751879621694/AnsiballZ_stat.py'
Oct 01 13:55:32 compute-0 sudo[212279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:32 compute-0 python3.9[212281]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:55:32 compute-0 sudo[212279]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:33 compute-0 sudo[212357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rojdkdendgbxnfmkvjdxjppxzgmalugd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326932.1588895-1942-232751879621694/AnsiballZ_file.py'
Oct 01 13:55:33 compute-0 sudo[212357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:33 compute-0 python3.9[212359]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:33 compute-0 sudo[212357]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:34 compute-0 sudo[212509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-panrymoghmtttlmilivhsmslhfxxxujr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326933.6389136-1966-78885490543569/AnsiballZ_stat.py'
Oct 01 13:55:34 compute-0 sudo[212509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:34 compute-0 python3.9[212511]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct 01 13:55:34 compute-0 sudo[212509]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:34 compute-0 sudo[212634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idlxuglscvoycpuekhlcnishfdgkfimc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326933.6389136-1966-78885490543569/AnsiballZ_copy.py'
Oct 01 13:55:34 compute-0 sudo[212634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:34 compute-0 python3.9[212636]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759326933.6389136-1966-78885490543569/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:34 compute-0 sudo[212634]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:35 compute-0 sudo[212786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmkrsfdzhfzxrqaysaynkpuuywnpxuzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326935.1328158-1996-42662751045592/AnsiballZ_file.py'
Oct 01 13:55:35 compute-0 sudo[212786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:35 compute-0 python3.9[212788]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:35 compute-0 sudo[212786]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:36 compute-0 sudo[212938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuvmndbjytxbwikwyjkxxsvifhnxkeqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326936.0437167-2012-41976263140853/AnsiballZ_command.py'
Oct 01 13:55:36 compute-0 sudo[212938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:36 compute-0 python3.9[212940]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:55:36 compute-0 sudo[212938]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:37 compute-0 podman[213021]: 2025-10-01 13:55:37.20182035 +0000 UTC m=+0.112082679 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Oct 01 13:55:37 compute-0 podman[213017]: 2025-10-01 13:55:37.209780307 +0000 UTC m=+0.117327772 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 01 13:55:37 compute-0 sudo[213131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xggyuaylwotwndnckwicunrsqtdnggwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326936.988209-2028-195961695820405/AnsiballZ_blockinfile.py'
Oct 01 13:55:37 compute-0 sudo[213131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:37 compute-0 python3.9[213133]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:37 compute-0 sudo[213131]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:38 compute-0 sudo[213283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baavjokeiotrfdxcdumdertaizkgwovz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326938.021666-2046-54135715321005/AnsiballZ_command.py'
Oct 01 13:55:38 compute-0 sudo[213283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:38 compute-0 python3.9[213285]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:55:38 compute-0 sudo[213283]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:39 compute-0 sudo[213436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgosorgllhhewmflfzcuxmcczuuhybyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326939.0564165-2062-106668579927466/AnsiballZ_stat.py'
Oct 01 13:55:39 compute-0 sudo[213436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:39 compute-0 python3.9[213438]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 01 13:55:39 compute-0 sudo[213436]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:40 compute-0 sudo[213590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdszpvtgbcyprquhwlbahrthuuomwusz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326939.8743865-2078-262571398530761/AnsiballZ_command.py'
Oct 01 13:55:40 compute-0 sudo[213590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:40 compute-0 python3.9[213592]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 01 13:55:40 compute-0 sudo[213590]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:41 compute-0 sudo[213745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktdkabzfrnnxksocuksyyjudsyusbgts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759326940.6645043-2094-113580184173920/AnsiballZ_file.py'
Oct 01 13:55:41 compute-0 sudo[213745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 13:55:41 compute-0 python3.9[213747]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct 01 13:55:41 compute-0 sudo[213745]: pam_unix(sudo:session): session closed for user root
Oct 01 13:55:41 compute-0 sshd-session[193043]: Connection closed by 192.168.122.30 port 47624
Oct 01 13:55:41 compute-0 sshd-session[193019]: pam_unix(sshd:session): session closed for user zuul
Oct 01 13:55:41 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Oct 01 13:55:41 compute-0 systemd[1]: session-28.scope: Consumed 1min 47.984s CPU time.
Oct 01 13:55:41 compute-0 systemd-logind[791]: Session 28 logged out. Waiting for processes to exit.
Oct 01 13:55:41 compute-0 systemd-logind[791]: Removed session 28.
Oct 01 13:55:45 compute-0 podman[213772]: 2025-10-01 13:55:45.186726984 +0000 UTC m=+0.095296001 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 13:55:54 compute-0 podman[213796]: 2025-10-01 13:55:54.179070003 +0000 UTC m=+0.086324637 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 01 13:55:54 compute-0 podman[213797]: 2025-10-01 13:55:54.224827971 +0000 UTC m=+0.127801948 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 01 13:55:59 compute-0 podman[203144]: time="2025-10-01T13:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 13:55:59 compute-0 podman[203144]: @ - - [01/Oct/2025:13:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 13:55:59 compute-0 podman[203144]: @ - - [01/Oct/2025:13:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2979 "" "Go-http-client/1.1"
Oct 01 13:56:01 compute-0 openstack_network_exporter[205307]: ERROR   13:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:56:01 compute-0 openstack_network_exporter[205307]: ERROR   13:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:56:01 compute-0 openstack_network_exporter[205307]: ERROR   13:56:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 13:56:01 compute-0 openstack_network_exporter[205307]: ERROR   13:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 13:56:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 13:56:01 compute-0 openstack_network_exporter[205307]: ERROR   13:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 13:56:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 13:56:02 compute-0 podman[213844]: 2025-10-01 13:56:02.154655193 +0000 UTC m=+0.067113312 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git)
Oct 01 13:56:05 compute-0 nova_compute[192698]: 2025-10-01 13:56:05.327 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:56:05 compute-0 nova_compute[192698]: 2025-10-01 13:56:05.327 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:56:05 compute-0 nova_compute[192698]: 2025-10-01 13:56:05.328 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:56:05 compute-0 nova_compute[192698]: 2025-10-01 13:56:05.328 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:56:05 compute-0 nova_compute[192698]: 2025-10-01 13:56:05.329 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:56:05 compute-0 nova_compute[192698]: 2025-10-01 13:56:05.329 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:56:05 compute-0 nova_compute[192698]: 2025-10-01 13:56:05.329 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:56:05 compute-0 nova_compute[192698]: 2025-10-01 13:56:05.330 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 13:56:05 compute-0 nova_compute[192698]: 2025-10-01 13:56:05.330 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:56:05 compute-0 nova_compute[192698]: 2025-10-01 13:56:05.845 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:56:05 compute-0 nova_compute[192698]: 2025-10-01 13:56:05.846 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:56:05 compute-0 nova_compute[192698]: 2025-10-01 13:56:05.846 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:56:05 compute-0 nova_compute[192698]: 2025-10-01 13:56:05.847 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 13:56:06 compute-0 nova_compute[192698]: 2025-10-01 13:56:06.089 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 13:56:06 compute-0 nova_compute[192698]: 2025-10-01 13:56:06.091 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 13:56:06 compute-0 nova_compute[192698]: 2025-10-01 13:56:06.110 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 13:56:06 compute-0 nova_compute[192698]: 2025-10-01 13:56:06.112 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6120MB free_disk=73.341796875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 13:56:06 compute-0 nova_compute[192698]: 2025-10-01 13:56:06.112 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:56:06 compute-0 nova_compute[192698]: 2025-10-01 13:56:06.113 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:56:07 compute-0 nova_compute[192698]: 2025-10-01 13:56:07.200 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 13:56:07 compute-0 nova_compute[192698]: 2025-10-01 13:56:07.201 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:56:06 up 55 min,  0 user,  load average: 0.66, 0.83, 0.71\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 13:56:07 compute-0 nova_compute[192698]: 2025-10-01 13:56:07.221 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 13:56:07 compute-0 nova_compute[192698]: 2025-10-01 13:56:07.731 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 13:56:08 compute-0 podman[213866]: 2025-10-01 13:56:08.155123944 +0000 UTC m=+0.069108666 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 01 13:56:08 compute-0 podman[213867]: 2025-10-01 13:56:08.170259627 +0000 UTC m=+0.072550370 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 01 13:56:08 compute-0 nova_compute[192698]: 2025-10-01 13:56:08.240 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 13:56:08 compute-0 nova_compute[192698]: 2025-10-01 13:56:08.240 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.128s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:56:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:14.217 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:56:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:14.217 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:56:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:14.217 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:56:16 compute-0 podman[213906]: 2025-10-01 13:56:16.177820009 +0000 UTC m=+0.084799945 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 13:56:25 compute-0 podman[213930]: 2025-10-01 13:56:25.16909038 +0000 UTC m=+0.077240659 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Oct 01 13:56:25 compute-0 podman[213931]: 2025-10-01 13:56:25.248768864 +0000 UTC m=+0.144281748 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 01 13:56:26 compute-0 PackageKit[130220]: daemon quit
Oct 01 13:56:26 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct 01 13:56:33 compute-0 podman[213976]: 2025-10-01 13:56:33.184072213 +0000 UTC m=+0.089619496 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Oct 01 13:56:39 compute-0 podman[213998]: 2025-10-01 13:56:39.154997608 +0000 UTC m=+0.071859065 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Oct 01 13:56:39 compute-0 podman[213999]: 2025-10-01 13:56:39.186494401 +0000 UTC m=+0.090294845 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:56:47 compute-0 podman[214039]: 2025-10-01 13:56:47.176817263 +0000 UTC m=+0.082148455 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 13:56:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:54.193 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 13:56:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:54.194 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 13:56:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:54.197 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 13:56:56 compute-0 podman[214064]: 2025-10-01 13:56:56.182056883 +0000 UTC m=+0.081908238 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:56:56 compute-0 podman[214065]: 2025-10-01 13:56:56.238966893 +0000 UTC m=+0.135542459 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 01 13:56:58 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:58.554 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:98:7a 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-572854dd-8d19-426e-9b3c-db3144728b53', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-572854dd-8d19-426e-9b3c-db3144728b53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dacac6049d34f02846f752af09ae16f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a839012a-64f3-46e8-b84d-f9980ba6a4db, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6f849a88-f411-475e-a9a3-649551b3ecb9) old=Port_Binding(mac=['fa:16:3e:d9:98:7a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-572854dd-8d19-426e-9b3c-db3144728b53', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-572854dd-8d19-426e-9b3c-db3144728b53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dacac6049d34f02846f752af09ae16f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 13:56:58 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:58.556 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6f849a88-f411-475e-a9a3-649551b3ecb9 in datapath 572854dd-8d19-426e-9b3c-db3144728b53 updated
Oct 01 13:56:58 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:58.558 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 572854dd-8d19-426e-9b3c-db3144728b53, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 13:56:58 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:58.559 103791 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp_vr73yju/privsep.sock']
Oct 01 13:56:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:59.354 103791 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 01 13:56:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:59.355 103791 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp_vr73yju/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Oct 01 13:56:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:59.158 214114 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 01 13:56:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:59.164 214114 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 01 13:56:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:59.166 214114 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Oct 01 13:56:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:59.166 214114 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214114
Oct 01 13:56:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:59.357 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[8bbae945-cef0-4cd1-9566-ffe7358042ed]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 13:56:59 compute-0 podman[203144]: time="2025-10-01T13:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 13:56:59 compute-0 podman[203144]: @ - - [01/Oct/2025:13:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 13:56:59 compute-0 podman[203144]: @ - - [01/Oct/2025:13:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2982 "" "Go-http-client/1.1"
Oct 01 13:56:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:59.883 214114 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:56:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:59.883 214114 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:56:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:56:59.884 214114 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:57:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:57:00.440 214114 INFO oslo_service.backend [-] Loading backend: eventlet
Oct 01 13:57:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:57:00.447 214114 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Oct 01 13:57:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:57:00.491 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[00615306-458c-4432-9f78-37d6d76faced]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 13:57:01 compute-0 openstack_network_exporter[205307]: ERROR   13:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:57:01 compute-0 openstack_network_exporter[205307]: ERROR   13:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:57:01 compute-0 openstack_network_exporter[205307]: ERROR   13:57:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 13:57:01 compute-0 openstack_network_exporter[205307]: ERROR   13:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 13:57:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 13:57:01 compute-0 openstack_network_exporter[205307]: ERROR   13:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 13:57:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 13:57:04 compute-0 podman[214124]: 2025-10-01 13:57:04.172238342 +0000 UTC m=+0.086786560 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, distribution-scope=public)
Oct 01 13:57:06 compute-0 nova_compute[192698]: 2025-10-01 13:57:06.827 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:57:06 compute-0 nova_compute[192698]: 2025-10-01 13:57:06.827 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:57:07 compute-0 nova_compute[192698]: 2025-10-01 13:57:07.337 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:57:07 compute-0 nova_compute[192698]: 2025-10-01 13:57:07.338 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:57:07 compute-0 nova_compute[192698]: 2025-10-01 13:57:07.338 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:57:07 compute-0 nova_compute[192698]: 2025-10-01 13:57:07.339 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:57:07 compute-0 nova_compute[192698]: 2025-10-01 13:57:07.339 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:57:07 compute-0 nova_compute[192698]: 2025-10-01 13:57:07.340 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:57:07 compute-0 nova_compute[192698]: 2025-10-01 13:57:07.340 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 13:57:07 compute-0 nova_compute[192698]: 2025-10-01 13:57:07.341 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:57:07 compute-0 nova_compute[192698]: 2025-10-01 13:57:07.854 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:57:07 compute-0 nova_compute[192698]: 2025-10-01 13:57:07.855 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:57:07 compute-0 nova_compute[192698]: 2025-10-01 13:57:07.856 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:57:07 compute-0 nova_compute[192698]: 2025-10-01 13:57:07.856 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 13:57:08 compute-0 nova_compute[192698]: 2025-10-01 13:57:08.035 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 13:57:08 compute-0 nova_compute[192698]: 2025-10-01 13:57:08.036 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 13:57:08 compute-0 nova_compute[192698]: 2025-10-01 13:57:08.051 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 13:57:08 compute-0 nova_compute[192698]: 2025-10-01 13:57:08.052 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6074MB free_disk=73.341796875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 13:57:08 compute-0 nova_compute[192698]: 2025-10-01 13:57:08.052 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:57:08 compute-0 nova_compute[192698]: 2025-10-01 13:57:08.052 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:57:09 compute-0 nova_compute[192698]: 2025-10-01 13:57:09.101 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 13:57:09 compute-0 nova_compute[192698]: 2025-10-01 13:57:09.101 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:57:08 up 56 min,  0 user,  load average: 0.40, 0.72, 0.68\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 13:57:09 compute-0 nova_compute[192698]: 2025-10-01 13:57:09.124 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 13:57:09 compute-0 nova_compute[192698]: 2025-10-01 13:57:09.630 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 13:57:10 compute-0 nova_compute[192698]: 2025-10-01 13:57:10.143 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 13:57:10 compute-0 nova_compute[192698]: 2025-10-01 13:57:10.144 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.091s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:57:10 compute-0 podman[214148]: 2025-10-01 13:57:10.153152926 +0000 UTC m=+0.066634384 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 01 13:57:10 compute-0 podman[214147]: 2025-10-01 13:57:10.177270579 +0000 UTC m=+0.094868659 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 01 13:57:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:57:14.219 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:57:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:57:14.219 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:57:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:57:14.219 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:57:18 compute-0 podman[214188]: 2025-10-01 13:57:18.177805338 +0000 UTC m=+0.087021346 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 13:57:27 compute-0 podman[214213]: 2025-10-01 13:57:27.191667759 +0000 UTC m=+0.084910449 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 01 13:57:27 compute-0 podman[214214]: 2025-10-01 13:57:27.243566793 +0000 UTC m=+0.131547901 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 01 13:57:29 compute-0 podman[203144]: time="2025-10-01T13:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 13:57:29 compute-0 podman[203144]: @ - - [01/Oct/2025:13:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 13:57:29 compute-0 podman[203144]: @ - - [01/Oct/2025:13:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2981 "" "Go-http-client/1.1"
Oct 01 13:57:31 compute-0 openstack_network_exporter[205307]: ERROR   13:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:57:31 compute-0 openstack_network_exporter[205307]: ERROR   13:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:57:31 compute-0 openstack_network_exporter[205307]: ERROR   13:57:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 13:57:31 compute-0 openstack_network_exporter[205307]: ERROR   13:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 13:57:31 compute-0 openstack_network_exporter[205307]: ERROR   13:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 13:57:35 compute-0 podman[214258]: 2025-10-01 13:57:35.153280785 +0000 UTC m=+0.070806067 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, architecture=x86_64, version=9.6, release=1755695350, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, distribution-scope=public)
Oct 01 13:57:41 compute-0 podman[214281]: 2025-10-01 13:57:41.180571464 +0000 UTC m=+0.093279285 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 01 13:57:41 compute-0 podman[214282]: 2025-10-01 13:57:41.188805917 +0000 UTC m=+0.094146339 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd)
Oct 01 13:57:49 compute-0 podman[214323]: 2025-10-01 13:57:49.16544825 +0000 UTC m=+0.078667129 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 13:57:58 compute-0 podman[214348]: 2025-10-01 13:57:58.201948134 +0000 UTC m=+0.111590040 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 01 13:57:58 compute-0 podman[214349]: 2025-10-01 13:57:58.21728878 +0000 UTC m=+0.118963921 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 01 13:57:58 compute-0 nova_compute[192698]: 2025-10-01 13:57:58.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:57:58 compute-0 nova_compute[192698]: 2025-10-01 13:57:58.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 01 13:57:59 compute-0 nova_compute[192698]: 2025-10-01 13:57:59.456 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 01 13:57:59 compute-0 nova_compute[192698]: 2025-10-01 13:57:59.457 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:57:59 compute-0 nova_compute[192698]: 2025-10-01 13:57:59.458 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 01 13:57:59 compute-0 podman[203144]: time="2025-10-01T13:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 13:57:59 compute-0 podman[203144]: @ - - [01/Oct/2025:13:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 13:57:59 compute-0 podman[203144]: @ - - [01/Oct/2025:13:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2978 "" "Go-http-client/1.1"
Oct 01 13:57:59 compute-0 nova_compute[192698]: 2025-10-01 13:57:59.964 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:58:01 compute-0 openstack_network_exporter[205307]: ERROR   13:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:58:01 compute-0 openstack_network_exporter[205307]: ERROR   13:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:58:01 compute-0 openstack_network_exporter[205307]: ERROR   13:58:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 13:58:01 compute-0 openstack_network_exporter[205307]: ERROR   13:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 13:58:01 compute-0 openstack_network_exporter[205307]: ERROR   13:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 13:58:02 compute-0 sshd-session[214393]: Invalid user kevin from 185.156.73.233 port 46034
Oct 01 13:58:02 compute-0 sshd-session[214393]: pam_unix(sshd:auth): check pass; user unknown
Oct 01 13:58:02 compute-0 sshd-session[214393]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233
Oct 01 13:58:03 compute-0 nova_compute[192698]: 2025-10-01 13:58:03.472 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:58:03 compute-0 nova_compute[192698]: 2025-10-01 13:58:03.473 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:58:03 compute-0 nova_compute[192698]: 2025-10-01 13:58:03.473 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:58:03 compute-0 nova_compute[192698]: 2025-10-01 13:58:03.473 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:58:03 compute-0 nova_compute[192698]: 2025-10-01 13:58:03.473 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:58:03 compute-0 nova_compute[192698]: 2025-10-01 13:58:03.927 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:58:04 compute-0 nova_compute[192698]: 2025-10-01 13:58:04.443 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:58:04 compute-0 nova_compute[192698]: 2025-10-01 13:58:04.444 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:58:04 compute-0 nova_compute[192698]: 2025-10-01 13:58:04.444 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:58:04 compute-0 nova_compute[192698]: 2025-10-01 13:58:04.444 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 13:58:04 compute-0 nova_compute[192698]: 2025-10-01 13:58:04.688 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 13:58:04 compute-0 nova_compute[192698]: 2025-10-01 13:58:04.690 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 13:58:04 compute-0 nova_compute[192698]: 2025-10-01 13:58:04.721 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 13:58:04 compute-0 nova_compute[192698]: 2025-10-01 13:58:04.722 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6092MB free_disk=73.3419418334961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 13:58:04 compute-0 nova_compute[192698]: 2025-10-01 13:58:04.723 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:58:04 compute-0 nova_compute[192698]: 2025-10-01 13:58:04.723 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:58:04 compute-0 sshd-session[214393]: Failed password for invalid user kevin from 185.156.73.233 port 46034 ssh2
Oct 01 13:58:05 compute-0 sshd-session[214393]: Connection closed by invalid user kevin 185.156.73.233 port 46034 [preauth]
Oct 01 13:58:05 compute-0 nova_compute[192698]: 2025-10-01 13:58:05.771 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 13:58:05 compute-0 nova_compute[192698]: 2025-10-01 13:58:05.772 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:58:04 up 57 min,  0 user,  load average: 0.39, 0.69, 0.67\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 13:58:05 compute-0 nova_compute[192698]: 2025-10-01 13:58:05.798 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 13:58:06 compute-0 podman[214396]: 2025-10-01 13:58:06.196224263 +0000 UTC m=+0.103072249 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, version=9.6)
Oct 01 13:58:06 compute-0 nova_compute[192698]: 2025-10-01 13:58:06.308 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 13:58:06 compute-0 nova_compute[192698]: 2025-10-01 13:58:06.823 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 13:58:06 compute-0 nova_compute[192698]: 2025-10-01 13:58:06.824 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:58:07 compute-0 nova_compute[192698]: 2025-10-01 13:58:07.812 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:58:07 compute-0 nova_compute[192698]: 2025-10-01 13:58:07.813 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:58:07 compute-0 nova_compute[192698]: 2025-10-01 13:58:07.813 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 13:58:12 compute-0 podman[214419]: 2025-10-01 13:58:12.196466383 +0000 UTC m=+0.100651555 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Oct 01 13:58:12 compute-0 podman[214418]: 2025-10-01 13:58:12.222076666 +0000 UTC m=+0.131555321 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, config_id=iscsid, container_name=iscsid)
Oct 01 13:58:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:58:14.221 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:58:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:58:14.222 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:58:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:58:14.222 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:58:20 compute-0 podman[214456]: 2025-10-01 13:58:20.154997005 +0000 UTC m=+0.065752050 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 13:58:29 compute-0 podman[214480]: 2025-10-01 13:58:29.189715951 +0000 UTC m=+0.097560571 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, 
tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 01 13:58:29 compute-0 podman[214481]: 2025-10-01 13:58:29.245649504 +0000 UTC m=+0.146422863 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:58:29 compute-0 podman[203144]: time="2025-10-01T13:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 13:58:29 compute-0 podman[203144]: @ - - [01/Oct/2025:13:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 13:58:29 compute-0 podman[203144]: @ - - [01/Oct/2025:13:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2983 "" "Go-http-client/1.1"
Oct 01 13:58:31 compute-0 openstack_network_exporter[205307]: ERROR   13:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:58:31 compute-0 openstack_network_exporter[205307]: ERROR   13:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:58:31 compute-0 openstack_network_exporter[205307]: ERROR   13:58:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 13:58:31 compute-0 openstack_network_exporter[205307]: ERROR   13:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 13:58:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 13:58:31 compute-0 openstack_network_exporter[205307]: ERROR   13:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 13:58:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 13:58:37 compute-0 podman[214523]: 2025-10-01 13:58:37.182004777 +0000 UTC m=+0.090349736 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Oct 01 13:58:43 compute-0 podman[214545]: 2025-10-01 13:58:43.191557717 +0000 UTC m=+0.097022457 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 01 13:58:43 compute-0 podman[214546]: 2025-10-01 13:58:43.196812339 +0000 UTC m=+0.098994660 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:58:51 compute-0 podman[214582]: 2025-10-01 13:58:51.20463805 +0000 UTC m=+0.111761968 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 13:58:59 compute-0 podman[203144]: time="2025-10-01T13:58:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 13:58:59 compute-0 podman[203144]: @ - - [01/Oct/2025:13:58:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 13:58:59 compute-0 podman[203144]: @ - - [01/Oct/2025:13:58:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2980 "" "Go-http-client/1.1"
Oct 01 13:59:00 compute-0 podman[214606]: 2025-10-01 13:59:00.169639751 +0000 UTC m=+0.081115651 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 01 13:59:00 compute-0 podman[214607]: 2025-10-01 13:59:00.221571823 +0000 UTC m=+0.133053633 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true)
Oct 01 13:59:01 compute-0 openstack_network_exporter[205307]: ERROR   13:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:59:01 compute-0 openstack_network_exporter[205307]: ERROR   13:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:59:01 compute-0 openstack_network_exporter[205307]: ERROR   13:59:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 13:59:01 compute-0 openstack_network_exporter[205307]: ERROR   13:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 13:59:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 13:59:01 compute-0 openstack_network_exporter[205307]: ERROR   13:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 13:59:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 13:59:02 compute-0 nova_compute[192698]: 2025-10-01 13:59:02.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:59:02 compute-0 nova_compute[192698]: 2025-10-01 13:59:02.927 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:59:03 compute-0 nova_compute[192698]: 2025-10-01 13:59:03.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:59:03 compute-0 nova_compute[192698]: 2025-10-01 13:59:03.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:59:03 compute-0 nova_compute[192698]: 2025-10-01 13:59:03.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:59:03 compute-0 nova_compute[192698]: 2025-10-01 13:59:03.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:59:04 compute-0 nova_compute[192698]: 2025-10-01 13:59:04.440 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:59:04 compute-0 nova_compute[192698]: 2025-10-01 13:59:04.440 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:59:04 compute-0 nova_compute[192698]: 2025-10-01 13:59:04.440 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:59:04 compute-0 nova_compute[192698]: 2025-10-01 13:59:04.440 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 13:59:04 compute-0 nova_compute[192698]: 2025-10-01 13:59:04.623 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 13:59:04 compute-0 nova_compute[192698]: 2025-10-01 13:59:04.624 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 13:59:04 compute-0 nova_compute[192698]: 2025-10-01 13:59:04.658 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 13:59:04 compute-0 nova_compute[192698]: 2025-10-01 13:59:04.659 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6084MB free_disk=73.34098434448242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 13:59:04 compute-0 nova_compute[192698]: 2025-10-01 13:59:04.659 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:59:04 compute-0 nova_compute[192698]: 2025-10-01 13:59:04.659 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:59:05 compute-0 nova_compute[192698]: 2025-10-01 13:59:05.720 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 13:59:05 compute-0 nova_compute[192698]: 2025-10-01 13:59:05.720 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 13:59:04 up 58 min,  0 user,  load average: 0.58, 0.66, 0.65\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 13:59:05 compute-0 nova_compute[192698]: 2025-10-01 13:59:05.772 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing inventories for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 01 13:59:05 compute-0 nova_compute[192698]: 2025-10-01 13:59:05.821 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating ProviderTree inventory for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 01 13:59:05 compute-0 nova_compute[192698]: 2025-10-01 13:59:05.822 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 13:59:05 compute-0 nova_compute[192698]: 2025-10-01 13:59:05.856 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing aggregate associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 01 13:59:05 compute-0 nova_compute[192698]: 2025-10-01 13:59:05.875 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing trait associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, traits: COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 01 13:59:05 compute-0 nova_compute[192698]: 2025-10-01 13:59:05.900 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 13:59:06 compute-0 nova_compute[192698]: 2025-10-01 13:59:06.409 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 13:59:06 compute-0 nova_compute[192698]: 2025-10-01 13:59:06.923 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 13:59:06 compute-0 nova_compute[192698]: 2025-10-01 13:59:06.923 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.264s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:59:07 compute-0 nova_compute[192698]: 2025-10-01 13:59:07.913 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:59:07 compute-0 nova_compute[192698]: 2025-10-01 13:59:07.913 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:59:07 compute-0 nova_compute[192698]: 2025-10-01 13:59:07.914 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 13:59:08 compute-0 podman[214656]: 2025-10-01 13:59:08.142422814 +0000 UTC m=+0.063541547 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 01 13:59:08 compute-0 nova_compute[192698]: 2025-10-01 13:59:08.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 13:59:14 compute-0 podman[214677]: 2025-10-01 13:59:14.181944806 +0000 UTC m=+0.095598302 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid)
Oct 01 13:59:14 compute-0 podman[214678]: 2025-10-01 13:59:14.182823079 +0000 UTC m=+0.084043179 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 01 13:59:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:59:14.223 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 13:59:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:59:14.224 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 13:59:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 13:59:14.224 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 13:59:22 compute-0 podman[214719]: 2025-10-01 13:59:22.191269135 +0000 UTC m=+0.096374074 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 13:59:29 compute-0 podman[203144]: time="2025-10-01T13:59:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 13:59:29 compute-0 podman[203144]: @ - - [01/Oct/2025:13:59:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 13:59:29 compute-0 podman[203144]: @ - - [01/Oct/2025:13:59:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2984 "" "Go-http-client/1.1"
Oct 01 13:59:31 compute-0 podman[214745]: 2025-10-01 13:59:31.179837293 +0000 UTC m=+0.086932288 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team)
Oct 01 13:59:31 compute-0 podman[214746]: 2025-10-01 13:59:31.259223877 +0000 UTC m=+0.162442797 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 01 13:59:31 compute-0 openstack_network_exporter[205307]: ERROR   13:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:59:31 compute-0 openstack_network_exporter[205307]: ERROR   13:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 13:59:31 compute-0 openstack_network_exporter[205307]: ERROR   13:59:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 13:59:31 compute-0 openstack_network_exporter[205307]: ERROR   13:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 13:59:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 13:59:31 compute-0 openstack_network_exporter[205307]: ERROR   13:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 13:59:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 13:59:39 compute-0 podman[214790]: 2025-10-01 13:59:39.167933398 +0000 UTC m=+0.086352602 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7)
Oct 01 13:59:45 compute-0 podman[214812]: 2025-10-01 13:59:45.169524715 +0000 UTC m=+0.077618407 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible)
Oct 01 13:59:45 compute-0 podman[214813]: 2025-10-01 13:59:45.212047693 +0000 UTC m=+0.106091655 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 13:59:53 compute-0 podman[214852]: 2025-10-01 13:59:53.17317364 +0000 UTC m=+0.082825678 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 13:59:59 compute-0 podman[203144]: time="2025-10-01T13:59:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 13:59:59 compute-0 podman[203144]: @ - - [01/Oct/2025:13:59:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 13:59:59 compute-0 podman[203144]: @ - - [01/Oct/2025:13:59:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2979 "" "Go-http-client/1.1"
Oct 01 14:00:01 compute-0 openstack_network_exporter[205307]: ERROR   14:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:00:01 compute-0 openstack_network_exporter[205307]: ERROR   14:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:00:01 compute-0 openstack_network_exporter[205307]: ERROR   14:00:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:00:01 compute-0 openstack_network_exporter[205307]: ERROR   14:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:00:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:00:01 compute-0 openstack_network_exporter[205307]: ERROR   14:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:00:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:00:02 compute-0 podman[214876]: 2025-10-01 14:00:02.178719175 +0000 UTC m=+0.089573360 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:00:02 compute-0 podman[214877]: 2025-10-01 14:00:02.237099511 +0000 UTC m=+0.143107275 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct 01 14:00:02 compute-0 nova_compute[192698]: 2025-10-01 14:00:02.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:00:02 compute-0 nova_compute[192698]: 2025-10-01 14:00:02.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:00:03 compute-0 nova_compute[192698]: 2025-10-01 14:00:03.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:00:03 compute-0 nova_compute[192698]: 2025-10-01 14:00:03.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:00:03 compute-0 nova_compute[192698]: 2025-10-01 14:00:03.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:00:04 compute-0 nova_compute[192698]: 2025-10-01 14:00:04.441 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:00:04 compute-0 nova_compute[192698]: 2025-10-01 14:00:04.441 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:00:04 compute-0 nova_compute[192698]: 2025-10-01 14:00:04.441 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:00:04 compute-0 nova_compute[192698]: 2025-10-01 14:00:04.441 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:00:04 compute-0 nova_compute[192698]: 2025-10-01 14:00:04.615 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:00:04 compute-0 nova_compute[192698]: 2025-10-01 14:00:04.616 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:00:04 compute-0 nova_compute[192698]: 2025-10-01 14:00:04.644 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:00:04 compute-0 nova_compute[192698]: 2025-10-01 14:00:04.645 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6088MB free_disk=73.34098434448242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:00:04 compute-0 nova_compute[192698]: 2025-10-01 14:00:04.645 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:00:04 compute-0 nova_compute[192698]: 2025-10-01 14:00:04.645 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:00:05 compute-0 nova_compute[192698]: 2025-10-01 14:00:05.687 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:00:05 compute-0 nova_compute[192698]: 2025-10-01 14:00:05.688 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:00:04 up 59 min,  0 user,  load average: 0.34, 0.57, 0.62\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:00:05 compute-0 nova_compute[192698]: 2025-10-01 14:00:05.718 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:00:06 compute-0 nova_compute[192698]: 2025-10-01 14:00:06.227 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:00:06 compute-0 nova_compute[192698]: 2025-10-01 14:00:06.737 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:00:06 compute-0 nova_compute[192698]: 2025-10-01 14:00:06.738 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:00:07 compute-0 nova_compute[192698]: 2025-10-01 14:00:07.738 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:00:07 compute-0 nova_compute[192698]: 2025-10-01 14:00:07.739 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:00:08 compute-0 nova_compute[192698]: 2025-10-01 14:00:08.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:00:08 compute-0 nova_compute[192698]: 2025-10-01 14:00:08.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:00:10 compute-0 podman[214920]: 2025-10-01 14:00:10.16719205 +0000 UTC m=+0.082429427 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Oct 01 14:00:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:14.225 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:00:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:14.226 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:00:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:14.226 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:00:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:15.244 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:00:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:15.246 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:00:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:15.247 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:00:16 compute-0 podman[214944]: 2025-10-01 14:00:16.192374714 +0000 UTC m=+0.100092093 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, container_name=multipathd, io.buildah.version=1.41.4, config_id=multipathd)
Oct 01 14:00:16 compute-0 podman[214943]: 2025-10-01 14:00:16.201781508 +0000 UTC m=+0.115091118 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 01 14:00:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:16.456 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:65:9e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8f4c2b16-767b-462f-b798-39ac905ce35c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f4c2b16-767b-462f-b798-39ac905ce35c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b087266be77f415da92dcaa9bb9595d6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=415199fa-db4c-4502-817b-6d547b8adebc, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=33780e04-9109-45a3-a7f3-dd36ae4a9769) old=Port_Binding(mac=['fa:16:3e:c9:65:9e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8f4c2b16-767b-462f-b798-39ac905ce35c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f4c2b16-767b-462f-b798-39ac905ce35c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b087266be77f415da92dcaa9bb9595d6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:00:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:16.457 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 33780e04-9109-45a3-a7f3-dd36ae4a9769 in datapath 8f4c2b16-767b-462f-b798-39ac905ce35c updated
Oct 01 14:00:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:16.458 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8f4c2b16-767b-462f-b798-39ac905ce35c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:00:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:16.460 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[13dc5db8-0056-4566-8502-d92daeccba02]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:00:24 compute-0 podman[214982]: 2025-10-01 14:00:24.168060033 +0000 UTC m=+0.079455676 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:00:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:26.414 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:18:34 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4a2c7057-0729-4e73-9b05-dfc453a0b5b8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2c7057-0729-4e73-9b05-dfc453a0b5b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '44611ee9e84348a4911ca81efb29d6d6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef4a1a09-7758-46db-8959-0f04e3695e5b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5cadfd76-6034-486a-9f40-312f8e28442c) old=Port_Binding(mac=['fa:16:3e:aa:18:34'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4a2c7057-0729-4e73-9b05-dfc453a0b5b8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4a2c7057-0729-4e73-9b05-dfc453a0b5b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '44611ee9e84348a4911ca81efb29d6d6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:00:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:26.416 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5cadfd76-6034-486a-9f40-312f8e28442c in datapath 4a2c7057-0729-4e73-9b05-dfc453a0b5b8 updated
Oct 01 14:00:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:26.417 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4a2c7057-0729-4e73-9b05-dfc453a0b5b8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:00:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:00:26.418 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a39bb6-ca67-461e-8362-d72f1301a0ad]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:00:29 compute-0 podman[203144]: time="2025-10-01T14:00:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:00:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:00:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:00:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:00:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2977 "" "Go-http-client/1.1"
Oct 01 14:00:31 compute-0 openstack_network_exporter[205307]: ERROR   14:00:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:00:31 compute-0 openstack_network_exporter[205307]: ERROR   14:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:00:31 compute-0 openstack_network_exporter[205307]: ERROR   14:00:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:00:31 compute-0 openstack_network_exporter[205307]: ERROR   14:00:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:00:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:00:31 compute-0 openstack_network_exporter[205307]: ERROR   14:00:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:00:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:00:33 compute-0 podman[215006]: 2025-10-01 14:00:33.175773718 +0000 UTC m=+0.086722313 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, 
org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 01 14:00:33 compute-0 podman[215007]: 2025-10-01 14:00:33.23696916 +0000 UTC m=+0.142265052 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller)
Oct 01 14:00:41 compute-0 podman[215051]: 2025-10-01 14:00:41.191792567 +0000 UTC m=+0.096719733 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base 
Image 9., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, version=9.6, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 01 14:00:47 compute-0 podman[215073]: 2025-10-01 14:00:47.183793765 +0000 UTC m=+0.090038961 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid)
Oct 01 14:00:47 compute-0 podman[215074]: 2025-10-01 14:00:47.190595068 +0000 UTC m=+0.091647184 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:00:55 compute-0 podman[215114]: 2025-10-01 14:00:55.164998343 +0000 UTC m=+0.073972628 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:00:59 compute-0 podman[203144]: time="2025-10-01T14:00:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:00:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:00:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:00:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:00:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2980 "" "Go-http-client/1.1"
Oct 01 14:01:01 compute-0 openstack_network_exporter[205307]: ERROR   14:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:01:01 compute-0 openstack_network_exporter[205307]: ERROR   14:01:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:01:01 compute-0 openstack_network_exporter[205307]: ERROR   14:01:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:01:01 compute-0 openstack_network_exporter[205307]: ERROR   14:01:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:01:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:01:01 compute-0 openstack_network_exporter[205307]: ERROR   14:01:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:01:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:01:01 compute-0 CROND[215139]: (root) CMD (run-parts /etc/cron.hourly)
Oct 01 14:01:01 compute-0 run-parts[215142]: (/etc/cron.hourly) starting 0anacron
Oct 01 14:01:01 compute-0 run-parts[215148]: (/etc/cron.hourly) finished 0anacron
Oct 01 14:01:01 compute-0 CROND[215138]: (root) CMDEND (run-parts /etc/cron.hourly)
Oct 01 14:01:02 compute-0 nova_compute[192698]: 2025-10-01 14:01:02.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:01:03 compute-0 nova_compute[192698]: 2025-10-01 14:01:03.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:01:03 compute-0 nova_compute[192698]: 2025-10-01 14:01:03.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:01:03 compute-0 nova_compute[192698]: 2025-10-01 14:01:03.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:01:04 compute-0 podman[215149]: 2025-10-01 14:01:04.169149752 +0000 UTC m=+0.078992564 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 14:01:04 compute-0 podman[215150]: 2025-10-01 14:01:04.228614658 +0000 UTC m=+0.131704778 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller)
Oct 01 14:01:04 compute-0 nova_compute[192698]: 2025-10-01 14:01:04.447 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:01:04 compute-0 nova_compute[192698]: 2025-10-01 14:01:04.448 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:01:04 compute-0 nova_compute[192698]: 2025-10-01 14:01:04.448 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:01:04 compute-0 nova_compute[192698]: 2025-10-01 14:01:04.448 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:01:04 compute-0 nova_compute[192698]: 2025-10-01 14:01:04.668 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:01:04 compute-0 nova_compute[192698]: 2025-10-01 14:01:04.670 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:01:04 compute-0 nova_compute[192698]: 2025-10-01 14:01:04.696 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:01:04 compute-0 nova_compute[192698]: 2025-10-01 14:01:04.698 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6090MB free_disk=73.34100341796875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:01:04 compute-0 nova_compute[192698]: 2025-10-01 14:01:04.698 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:01:04 compute-0 nova_compute[192698]: 2025-10-01 14:01:04.698 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:01:05 compute-0 nova_compute[192698]: 2025-10-01 14:01:05.889 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:01:05 compute-0 nova_compute[192698]: 2025-10-01 14:01:05.890 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:01:04 up  1:00,  0 user,  load average: 0.12, 0.46, 0.58\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:01:05 compute-0 nova_compute[192698]: 2025-10-01 14:01:05.918 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:01:06 compute-0 nova_compute[192698]: 2025-10-01 14:01:06.427 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:01:06 compute-0 nova_compute[192698]: 2025-10-01 14:01:06.941 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:01:06 compute-0 nova_compute[192698]: 2025-10-01 14:01:06.942 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.243s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:01:07 compute-0 unix_chkpwd[215199]: password check failed for user (root)
Oct 01 14:01:07 compute-0 sshd-session[215197]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 01 14:01:07 compute-0 nova_compute[192698]: 2025-10-01 14:01:07.942 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:01:07 compute-0 nova_compute[192698]: 2025-10-01 14:01:07.944 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:01:07 compute-0 nova_compute[192698]: 2025-10-01 14:01:07.944 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:01:08 compute-0 nova_compute[192698]: 2025-10-01 14:01:08.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:01:09 compute-0 sshd-session[215197]: Failed password for root from 91.224.92.32 port 52172 ssh2
Oct 01 14:01:09 compute-0 nova_compute[192698]: 2025-10-01 14:01:09.424 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:01:09 compute-0 nova_compute[192698]: 2025-10-01 14:01:09.425 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:01:10 compute-0 unix_chkpwd[215200]: password check failed for user (root)
Oct 01 14:01:12 compute-0 podman[215201]: 2025-10-01 14:01:12.19144232 +0000 UTC m=+0.100081433 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7)
Oct 01 14:01:12 compute-0 sshd-session[215197]: Failed password for root from 91.224.92.32 port 52172 ssh2
Oct 01 14:01:13 compute-0 unix_chkpwd[215222]: password check failed for user (root)
Oct 01 14:01:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:01:14.227 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:01:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:01:14.227 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:01:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:01:14.227 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:01:15 compute-0 sshd-session[215197]: Failed password for root from 91.224.92.32 port 52172 ssh2
Oct 01 14:01:16 compute-0 sshd-session[215197]: Received disconnect from 91.224.92.32 port 52172:11:  [preauth]
Oct 01 14:01:16 compute-0 sshd-session[215197]: Disconnected from authenticating user root 91.224.92.32 port 52172 [preauth]
Oct 01 14:01:16 compute-0 sshd-session[215197]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 01 14:01:17 compute-0 unix_chkpwd[215226]: password check failed for user (root)
Oct 01 14:01:17 compute-0 sshd-session[215224]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 01 14:01:18 compute-0 podman[215227]: 2025-10-01 14:01:18.173434538 +0000 UTC m=+0.087403481 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 01 14:01:18 compute-0 podman[215228]: 2025-10-01 14:01:18.195849643 +0000 UTC m=+0.103425693 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 01 14:01:18 compute-0 sshd-session[215224]: Failed password for root from 91.224.92.32 port 27558 ssh2
Oct 01 14:01:19 compute-0 unix_chkpwd[215268]: password check failed for user (root)
Oct 01 14:01:21 compute-0 sshd-session[215224]: Failed password for root from 91.224.92.32 port 27558 ssh2
Oct 01 14:01:22 compute-0 unix_chkpwd[215269]: password check failed for user (root)
Oct 01 14:01:24 compute-0 sshd-session[215224]: Failed password for root from 91.224.92.32 port 27558 ssh2
Oct 01 14:01:25 compute-0 sshd-session[215224]: Received disconnect from 91.224.92.32 port 27558:11:  [preauth]
Oct 01 14:01:25 compute-0 sshd-session[215224]: Disconnected from authenticating user root 91.224.92.32 port 27558 [preauth]
Oct 01 14:01:25 compute-0 sshd-session[215224]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 01 14:01:26 compute-0 podman[215272]: 2025-10-01 14:01:26.182044687 +0000 UTC m=+0.092304834 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:01:26 compute-0 unix_chkpwd[215298]: password check failed for user (root)
Oct 01 14:01:26 compute-0 sshd-session[215270]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 01 14:01:28 compute-0 sshd-session[215270]: Failed password for root from 91.224.92.32 port 54496 ssh2
Oct 01 14:01:29 compute-0 unix_chkpwd[215299]: password check failed for user (root)
Oct 01 14:01:29 compute-0 podman[203144]: time="2025-10-01T14:01:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:01:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:01:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:01:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:01:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2983 "" "Go-http-client/1.1"
Oct 01 14:01:31 compute-0 sshd-session[215270]: Failed password for root from 91.224.92.32 port 54496 ssh2
Oct 01 14:01:31 compute-0 openstack_network_exporter[205307]: ERROR   14:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:01:31 compute-0 openstack_network_exporter[205307]: ERROR   14:01:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:01:31 compute-0 openstack_network_exporter[205307]: ERROR   14:01:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:01:31 compute-0 openstack_network_exporter[205307]: ERROR   14:01:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:01:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:01:31 compute-0 openstack_network_exporter[205307]: ERROR   14:01:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:01:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:01:32 compute-0 unix_chkpwd[215301]: password check failed for user (root)
Oct 01 14:01:34 compute-0 sshd-session[215270]: Failed password for root from 91.224.92.32 port 54496 ssh2
Oct 01 14:01:34 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:01:34.980 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:01:34 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:01:34.981 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:01:35 compute-0 podman[215303]: 2025-10-01 14:01:35.176846772 +0000 UTC m=+0.080114294 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes 
Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 01 14:01:35 compute-0 sshd-session[215270]: Received disconnect from 91.224.92.32 port 54496:11:  [preauth]
Oct 01 14:01:35 compute-0 sshd-session[215270]: Disconnected from authenticating user root 91.224.92.32 port 54496 [preauth]
Oct 01 14:01:35 compute-0 sshd-session[215270]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 01 14:01:35 compute-0 podman[215304]: 2025-10-01 14:01:35.22713706 +0000 UTC m=+0.123332931 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Oct 01 14:01:43 compute-0 podman[215348]: 2025-10-01 14:01:43.219262265 +0000 UTC m=+0.121029869 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 01 14:01:43 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:01:43.983 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:01:49 compute-0 podman[215369]: 2025-10-01 14:01:49.189825674 +0000 UTC m=+0.099633141 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:01:49 compute-0 podman[215370]: 2025-10-01 14:01:49.201705345 +0000 UTC m=+0.101935904 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Oct 01 14:01:57 compute-0 podman[215409]: 2025-10-01 14:01:57.180851688 +0000 UTC m=+0.087779761 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 14:01:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:01:59.031 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:8b:50 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b83843d-74e6-401a-9419-27491d8fece7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '18a5e4cbf5004eb69c4a4632324f35d0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=74ce129b-0e7f-4161-8565-94672e8a679d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bba308e4-f061-4ef4-89aa-6683f56ddb01) old=Port_Binding(mac=['fa:16:3e:4a:8b:50'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b83843d-74e6-401a-9419-27491d8fece7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '18a5e4cbf5004eb69c4a4632324f35d0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:01:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:01:59.033 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bba308e4-f061-4ef4-89aa-6683f56ddb01 in datapath 8b83843d-74e6-401a-9419-27491d8fece7 updated
Oct 01 14:01:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:01:59.034 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8b83843d-74e6-401a-9419-27491d8fece7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:01:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:01:59.035 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[3988a659-6831-4231-834f-fc03d407bfc1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:01:59 compute-0 podman[203144]: time="2025-10-01T14:01:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:01:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:01:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:01:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:01:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2981 "" "Go-http-client/1.1"
Oct 01 14:02:01 compute-0 openstack_network_exporter[205307]: ERROR   14:02:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:02:01 compute-0 openstack_network_exporter[205307]: ERROR   14:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:02:01 compute-0 openstack_network_exporter[205307]: ERROR   14:02:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:02:01 compute-0 openstack_network_exporter[205307]: ERROR   14:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:02:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:02:01 compute-0 openstack_network_exporter[205307]: ERROR   14:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:02:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:02:02 compute-0 nova_compute[192698]: 2025-10-01 14:02:02.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:02:03 compute-0 nova_compute[192698]: 2025-10-01 14:02:03.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:02:04 compute-0 nova_compute[192698]: 2025-10-01 14:02:04.446 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:02:04 compute-0 nova_compute[192698]: 2025-10-01 14:02:04.447 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:02:04 compute-0 nova_compute[192698]: 2025-10-01 14:02:04.447 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:02:04 compute-0 nova_compute[192698]: 2025-10-01 14:02:04.447 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:02:04 compute-0 nova_compute[192698]: 2025-10-01 14:02:04.659 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:02:04 compute-0 nova_compute[192698]: 2025-10-01 14:02:04.661 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:02:04 compute-0 nova_compute[192698]: 2025-10-01 14:02:04.687 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:02:04 compute-0 nova_compute[192698]: 2025-10-01 14:02:04.688 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6089MB free_disk=73.3408088684082GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:02:04 compute-0 nova_compute[192698]: 2025-10-01 14:02:04.688 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:02:04 compute-0 nova_compute[192698]: 2025-10-01 14:02:04.689 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:02:05 compute-0 nova_compute[192698]: 2025-10-01 14:02:05.784 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:02:05 compute-0 nova_compute[192698]: 2025-10-01 14:02:05.785 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:02:04 up  1:01,  0 user,  load average: 0.10, 0.39, 0.55\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:02:05 compute-0 nova_compute[192698]: 2025-10-01 14:02:05.811 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:02:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:05.983 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:6d:04 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9e8c5a77-0a6a-4a3c-a27f-9465cabc782c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e8c5a77-0a6a-4a3c-a27f-9465cabc782c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557b5b6333c4a08801c674394739795', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=869fa728-34d5-4312-ae48-dfe792b5189f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e82d19af-aa72-4179-9c65-914a9e71e249) old=Port_Binding(mac=['fa:16:3e:4b:6d:04'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-9e8c5a77-0a6a-4a3c-a27f-9465cabc782c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9e8c5a77-0a6a-4a3c-a27f-9465cabc782c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557b5b6333c4a08801c674394739795', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:02:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:05.985 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e82d19af-aa72-4179-9c65-914a9e71e249 in datapath 9e8c5a77-0a6a-4a3c-a27f-9465cabc782c updated
Oct 01 14:02:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:05.986 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9e8c5a77-0a6a-4a3c-a27f-9465cabc782c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:02:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:05.987 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d803171e-4caf-44da-a764-b3737b23cac9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:06 compute-0 podman[215435]: 2025-10-01 14:02:06.186851516 +0000 UTC m=+0.093018482 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 01 14:02:06 compute-0 podman[215436]: 2025-10-01 14:02:06.280944907 +0000 UTC m=+0.183260209 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 01 14:02:06 compute-0 nova_compute[192698]: 2025-10-01 14:02:06.320 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:02:06 compute-0 nova_compute[192698]: 2025-10-01 14:02:06.832 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:02:06 compute-0 nova_compute[192698]: 2025-10-01 14:02:06.832 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.143s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:02:07 compute-0 nova_compute[192698]: 2025-10-01 14:02:07.832 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:02:07 compute-0 nova_compute[192698]: 2025-10-01 14:02:07.833 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:02:07 compute-0 nova_compute[192698]: 2025-10-01 14:02:07.833 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:02:07 compute-0 nova_compute[192698]: 2025-10-01 14:02:07.833 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:02:07 compute-0 nova_compute[192698]: 2025-10-01 14:02:07.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:02:08 compute-0 nova_compute[192698]: 2025-10-01 14:02:08.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:02:08 compute-0 nova_compute[192698]: 2025-10-01 14:02:08.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:02:14 compute-0 podman[215481]: 2025-10-01 14:02:14.161465957 +0000 UTC m=+0.067351539 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm)
Oct 01 14:02:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:14.228 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:02:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:14.229 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:02:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:14.229 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:02:20 compute-0 podman[215503]: 2025-10-01 14:02:20.149674194 +0000 UTC m=+0.065041947 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:02:20 compute-0 podman[215504]: 2025-10-01 14:02:20.164438753 +0000 UTC m=+0.067640078 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Oct 01 14:02:28 compute-0 podman[215541]: 2025-10-01 14:02:28.213172113 +0000 UTC m=+0.077357390 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:02:29 compute-0 podman[203144]: time="2025-10-01T14:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:02:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:02:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:02:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2981 "" "Go-http-client/1.1"
Oct 01 14:02:31 compute-0 openstack_network_exporter[205307]: ERROR   14:02:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:02:31 compute-0 openstack_network_exporter[205307]: ERROR   14:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:02:31 compute-0 openstack_network_exporter[205307]: ERROR   14:02:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:02:31 compute-0 openstack_network_exporter[205307]: ERROR   14:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:02:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:02:31 compute-0 openstack_network_exporter[205307]: ERROR   14:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:02:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:02:37 compute-0 podman[215566]: 2025-10-01 14:02:37.155900774 +0000 UTC m=+0.072856748 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 01 14:02:37 compute-0 podman[215567]: 2025-10-01 14:02:37.226632514 +0000 UTC m=+0.134048971 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 01 14:02:37 compute-0 nova_compute[192698]: 2025-10-01 14:02:37.725 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Acquiring lock "6bbb3240-f185-4efc-9aaa-ed008923c68a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:02:37 compute-0 nova_compute[192698]: 2025-10-01 14:02:37.725 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:02:38 compute-0 nova_compute[192698]: 2025-10-01 14:02:38.233 2 DEBUG nova.compute.manager [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 01 14:02:38 compute-0 nova_compute[192698]: 2025-10-01 14:02:38.884 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:02:38 compute-0 nova_compute[192698]: 2025-10-01 14:02:38.886 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:02:38 compute-0 nova_compute[192698]: 2025-10-01 14:02:38.894 2 DEBUG nova.virt.hardware [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:02:38 compute-0 nova_compute[192698]: 2025-10-01 14:02:38.895 2 INFO nova.compute.claims [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:02:39 compute-0 nova_compute[192698]: 2025-10-01 14:02:39.954 2 DEBUG nova.compute.provider_tree [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:02:40 compute-0 nova_compute[192698]: 2025-10-01 14:02:40.462 2 DEBUG nova.scheduler.client.report [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:02:40 compute-0 nova_compute[192698]: 2025-10-01 14:02:40.972 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.086s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:02:40 compute-0 nova_compute[192698]: 2025-10-01 14:02:40.973 2 DEBUG nova.compute.manager [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 01 14:02:41 compute-0 nova_compute[192698]: 2025-10-01 14:02:41.486 2 DEBUG nova.compute.manager [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 01 14:02:41 compute-0 nova_compute[192698]: 2025-10-01 14:02:41.486 2 DEBUG nova.network.neutron [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 01 14:02:41 compute-0 nova_compute[192698]: 2025-10-01 14:02:41.487 2 WARNING neutronclient.v2_0.client [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:02:41 compute-0 nova_compute[192698]: 2025-10-01 14:02:41.489 2 WARNING neutronclient.v2_0.client [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:02:42 compute-0 nova_compute[192698]: 2025-10-01 14:02:42.005 2 INFO nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 01 14:02:42 compute-0 nova_compute[192698]: 2025-10-01 14:02:42.516 2 DEBUG nova.compute.manager [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 01 14:02:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:42.923 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:02:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:42.924 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:02:43 compute-0 nova_compute[192698]: 2025-10-01 14:02:43.159 2 DEBUG nova.network.neutron [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Successfully created port: 6919878f-3d2e-4ab5-b34c-215a0ff9579b _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 01 14:02:43 compute-0 nova_compute[192698]: 2025-10-01 14:02:43.539 2 DEBUG nova.compute.manager [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 01 14:02:43 compute-0 nova_compute[192698]: 2025-10-01 14:02:43.541 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 01 14:02:43 compute-0 nova_compute[192698]: 2025-10-01 14:02:43.541 2 INFO nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Creating image(s)
Oct 01 14:02:43 compute-0 nova_compute[192698]: 2025-10-01 14:02:43.542 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Acquiring lock "/var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:02:43 compute-0 nova_compute[192698]: 2025-10-01 14:02:43.543 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "/var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:02:43 compute-0 nova_compute[192698]: 2025-10-01 14:02:43.544 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "/var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:02:43 compute-0 nova_compute[192698]: 2025-10-01 14:02:43.544 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:02:43 compute-0 nova_compute[192698]: 2025-10-01 14:02:43.545 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:02:43 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:43.927 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.572 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.577 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.577 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.654 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546.part --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.656 2 DEBUG nova.virt.images [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] 48696e9b-a20d-4bf6-8ac2-6438fe748ab6 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.658 2 DEBUG nova.privsep.utils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.658 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546.part /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.865 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546.part /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546.converted" returned: 0 in 0.207s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.873 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.956 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546.converted --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.957 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.412s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.958 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.962 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:02:44 compute-0 nova_compute[192698]: 2025-10-01 14:02:44.964 2 INFO oslo.privsep.daemon [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp4b2bqmi1/privsep.sock']
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.036 2 DEBUG nova.network.neutron [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Successfully updated port: 6919878f-3d2e-4ab5-b34c-215a0ff9579b _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.093 2 DEBUG nova.compute.manager [req-8f9e27c5-c217-47d7-8e54-8f383865ae40 req-efd660fd-5f38-4f6e-aa36-ece1bd46798b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Received event network-changed-6919878f-3d2e-4ab5-b34c-215a0ff9579b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.094 2 DEBUG nova.compute.manager [req-8f9e27c5-c217-47d7-8e54-8f383865ae40 req-efd660fd-5f38-4f6e-aa36-ece1bd46798b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Refreshing instance network info cache due to event network-changed-6919878f-3d2e-4ab5-b34c-215a0ff9579b. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.094 2 DEBUG oslo_concurrency.lockutils [req-8f9e27c5-c217-47d7-8e54-8f383865ae40 req-efd660fd-5f38-4f6e-aa36-ece1bd46798b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-6bbb3240-f185-4efc-9aaa-ed008923c68a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.095 2 DEBUG oslo_concurrency.lockutils [req-8f9e27c5-c217-47d7-8e54-8f383865ae40 req-efd660fd-5f38-4f6e-aa36-ece1bd46798b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-6bbb3240-f185-4efc-9aaa-ed008923c68a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.096 2 DEBUG nova.network.neutron [req-8f9e27c5-c217-47d7-8e54-8f383865ae40 req-efd660fd-5f38-4f6e-aa36-ece1bd46798b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Refreshing network info cache for port 6919878f-3d2e-4ab5-b34c-215a0ff9579b _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:02:45 compute-0 podman[215628]: 2025-10-01 14:02:45.15742041 +0000 UTC m=+0.077861433 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64)
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.544 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Acquiring lock "refresh_cache-6bbb3240-f185-4efc-9aaa-ed008923c68a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.603 2 WARNING neutronclient.v2_0.client [req-8f9e27c5-c217-47d7-8e54-8f383865ae40 req-efd660fd-5f38-4f6e-aa36-ece1bd46798b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.772 2 INFO oslo.privsep.daemon [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Spawned new privsep daemon via rootwrap
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.567 65 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.574 65 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.577 65 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.578 65 INFO oslo.privsep.daemon [-] privsep daemon running as pid 65
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.867 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.908 2 DEBUG nova.network.neutron [req-8f9e27c5-c217-47d7-8e54-8f383865ae40 req-efd660fd-5f38-4f6e-aa36-ece1bd46798b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.926 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.927 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.928 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.929 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.935 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.936 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.995 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:02:45 compute-0 nova_compute[192698]: 2025-10-01 14:02:45.997 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.049 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.051 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.051 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.084 2 DEBUG nova.network.neutron [req-8f9e27c5-c217-47d7-8e54-8f383865ae40 req-efd660fd-5f38-4f6e-aa36-ece1bd46798b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.109 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.110 2 DEBUG nova.virt.disk.api [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Checking if we can resize image /var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.111 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.198 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.199 2 DEBUG nova.virt.disk.api [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Cannot resize image /var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.200 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.201 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Ensure instance console log exists: /var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.201 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.202 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.202 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.593 2 DEBUG oslo_concurrency.lockutils [req-8f9e27c5-c217-47d7-8e54-8f383865ae40 req-efd660fd-5f38-4f6e-aa36-ece1bd46798b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-6bbb3240-f185-4efc-9aaa-ed008923c68a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.594 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Acquired lock "refresh_cache-6bbb3240-f185-4efc-9aaa-ed008923c68a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:02:46 compute-0 nova_compute[192698]: 2025-10-01 14:02:46.595 2 DEBUG nova.network.neutron [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:02:47 compute-0 nova_compute[192698]: 2025-10-01 14:02:47.934 2 DEBUG nova.network.neutron [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:02:48 compute-0 nova_compute[192698]: 2025-10-01 14:02:48.162 2 WARNING neutronclient.v2_0.client [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.010 2 DEBUG nova.network.neutron [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Updating instance_info_cache with network_info: [{"id": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "address": "fa:16:3e:3b:f1:fb", "network": {"id": "8b83843d-74e6-401a-9419-27491d8fece7", "bridge": "br-int", "label": "tempest-TestDataModel-780178721-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18a5e4cbf5004eb69c4a4632324f35d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6919878f-3d", "ovs_interfaceid": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.517 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Releasing lock "refresh_cache-6bbb3240-f185-4efc-9aaa-ed008923c68a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.518 2 DEBUG nova.compute.manager [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Instance network_info: |[{"id": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "address": "fa:16:3e:3b:f1:fb", "network": {"id": "8b83843d-74e6-401a-9419-27491d8fece7", "bridge": "br-int", "label": "tempest-TestDataModel-780178721-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18a5e4cbf5004eb69c4a4632324f35d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6919878f-3d", "ovs_interfaceid": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.524 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Start _get_guest_xml network_info=[{"id": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "address": "fa:16:3e:3b:f1:fb", "network": {"id": "8b83843d-74e6-401a-9419-27491d8fece7", "bridge": "br-int", "label": "tempest-TestDataModel-780178721-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18a5e4cbf5004eb69c4a4632324f35d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6919878f-3d", "ovs_interfaceid": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.531 2 WARNING nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.534 2 DEBUG nova.virt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestDataModel-server-1148779107', uuid='6bbb3240-f185-4efc-9aaa-ed008923c68a'), owner=OwnerMeta(userid='0874d87de40e4626b04c1e4b35a90268', username='tempest-TestDataModel-621315703-project-admin', projectid='d557b5b6333c4a08801c674394739795', projectname='tempest-TestDataModel-621315703'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "address": "fa:16:3e:3b:f1:fb", "network": {"id": "8b83843d-74e6-401a-9419-27491d8fece7", "bridge": "br-int", "label": "tempest-TestDataModel-780178721-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18a5e4cbf5004eb69c4a4632324f35d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6919878f-3d", "ovs_interfaceid": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "qbh_params": null, "qbg_params": null, "active": false, 
"vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759327369.5341418) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.541 2 DEBUG nova.virt.libvirt.host [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.542 2 DEBUG nova.virt.libvirt.host [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.545 2 DEBUG nova.virt.libvirt.host [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.546 2 DEBUG nova.virt.libvirt.host [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.547 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.547 2 DEBUG nova.virt.hardware [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.548 2 DEBUG nova.virt.hardware [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.549 2 DEBUG nova.virt.hardware [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.549 2 DEBUG nova.virt.hardware [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.550 2 DEBUG nova.virt.hardware [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.550 2 DEBUG nova.virt.hardware [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.551 2 DEBUG nova.virt.hardware [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.551 2 DEBUG nova.virt.hardware [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.552 2 DEBUG nova.virt.hardware [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.552 2 DEBUG nova.virt.hardware [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.552 2 DEBUG nova.virt.hardware [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.559 2 DEBUG nova.privsep.utils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.561 2 DEBUG nova.virt.libvirt.vif [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:02:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1148779107',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1148779107',id=3,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d557b5b6333c4a08801c674394739795',ramdisk_id='',reservation_id='r-nh11smmq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-621315703',owner_user_name='tempest-TestDataModel-621315703-project-admin'},tags=TagList,task_state='spawni
ng',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:02:42Z,user_data=None,user_id='0874d87de40e4626b04c1e4b35a90268',uuid=6bbb3240-f185-4efc-9aaa-ed008923c68a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "address": "fa:16:3e:3b:f1:fb", "network": {"id": "8b83843d-74e6-401a-9419-27491d8fece7", "bridge": "br-int", "label": "tempest-TestDataModel-780178721-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18a5e4cbf5004eb69c4a4632324f35d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6919878f-3d", "ovs_interfaceid": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.562 2 DEBUG nova.network.os_vif_util [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Converting VIF {"id": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "address": "fa:16:3e:3b:f1:fb", "network": {"id": "8b83843d-74e6-401a-9419-27491d8fece7", "bridge": "br-int", "label": "tempest-TestDataModel-780178721-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18a5e4cbf5004eb69c4a4632324f35d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6919878f-3d", "ovs_interfaceid": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.563 2 DEBUG nova.network.os_vif_util [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:f1:fb,bridge_name='br-int',has_traffic_filtering=True,id=6919878f-3d2e-4ab5-b34c-215a0ff9579b,network=Network(8b83843d-74e6-401a-9419-27491d8fece7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6919878f-3d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:02:49 compute-0 nova_compute[192698]: 2025-10-01 14:02:49.565 2 DEBUG nova.objects.instance [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6bbb3240-f185-4efc-9aaa-ed008923c68a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.081 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:02:50 compute-0 nova_compute[192698]:   <uuid>6bbb3240-f185-4efc-9aaa-ed008923c68a</uuid>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   <name>instance-00000003</name>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <nova:name>tempest-TestDataModel-server-1148779107</nova:name>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:02:49</nova:creationTime>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:02:50 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:02:50 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:user uuid="0874d87de40e4626b04c1e4b35a90268">tempest-TestDataModel-621315703-project-admin</nova:user>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:project uuid="d557b5b6333c4a08801c674394739795">tempest-TestDataModel-621315703</nova:project>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         <nova:port uuid="6919878f-3d2e-4ab5-b34c-215a0ff9579b">
Oct 01 14:02:50 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <system>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <entry name="serial">6bbb3240-f185-4efc-9aaa-ed008923c68a</entry>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <entry name="uuid">6bbb3240-f185-4efc-9aaa-ed008923c68a</entry>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     </system>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   <os>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   </os>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   <features>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   </features>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk.config"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:3b:f1:fb"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <target dev="tap6919878f-3d"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/console.log" append="off"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <video>
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     </video>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:02:50 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:02:50 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:02:50 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:02:50 compute-0 nova_compute[192698]: </domain>
Oct 01 14:02:50 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.083 2 DEBUG nova.compute.manager [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Preparing to wait for external event network-vif-plugged-6919878f-3d2e-4ab5-b34c-215a0ff9579b prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.084 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Acquiring lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.084 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.085 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.086 2 DEBUG nova.virt.libvirt.vif [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:02:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1148779107',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1148779107',id=3,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d557b5b6333c4a08801c674394739795',ramdisk_id='',reservation_id='r-nh11smmq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestDataModel-621315703',owner_user_name='tempest-TestDataModel-621315703-project-admin'},tags=TagList,task_sta
te='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:02:42Z,user_data=None,user_id='0874d87de40e4626b04c1e4b35a90268',uuid=6bbb3240-f185-4efc-9aaa-ed008923c68a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "address": "fa:16:3e:3b:f1:fb", "network": {"id": "8b83843d-74e6-401a-9419-27491d8fece7", "bridge": "br-int", "label": "tempest-TestDataModel-780178721-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18a5e4cbf5004eb69c4a4632324f35d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6919878f-3d", "ovs_interfaceid": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.086 2 DEBUG nova.network.os_vif_util [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Converting VIF {"id": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "address": "fa:16:3e:3b:f1:fb", "network": {"id": "8b83843d-74e6-401a-9419-27491d8fece7", "bridge": "br-int", "label": "tempest-TestDataModel-780178721-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18a5e4cbf5004eb69c4a4632324f35d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6919878f-3d", "ovs_interfaceid": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.087 2 DEBUG nova.network.os_vif_util [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:f1:fb,bridge_name='br-int',has_traffic_filtering=True,id=6919878f-3d2e-4ab5-b34c-215a0ff9579b,network=Network(8b83843d-74e6-401a-9419-27491d8fece7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6919878f-3d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.088 2 DEBUG os_vif [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:f1:fb,bridge_name='br-int',has_traffic_filtering=True,id=6919878f-3d2e-4ab5-b34c-215a0ff9579b,network=Network(8b83843d-74e6-401a-9419-27491d8fece7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6919878f-3d') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.142 2 DEBUG ovsdbapp.backend.ovs_idl [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.142 2 DEBUG ovsdbapp.backend.ovs_idl [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.142 2 DEBUG ovsdbapp.backend.ovs_idl [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.157 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.157 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.158 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '2b5253ce-349d-5ae8-bd87-d5b1e98ac2d7', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.163 2 INFO oslo.privsep.daemon [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpeifajg0a/privsep.sock']
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.953 2 INFO oslo.privsep.daemon [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Spawned new privsep daemon via rootwrap
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.783 86 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.790 86 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.794 86 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Oct 01 14:02:50 compute-0 nova_compute[192698]: 2025-10-01 14:02:50.794 86 INFO oslo.privsep.daemon [-] privsep daemon running as pid 86
Oct 01 14:02:51 compute-0 podman[215677]: 2025-10-01 14:02:51.190022814 +0000 UTC m=+0.097482054 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 01 14:02:51 compute-0 podman[215678]: 2025-10-01 14:02:51.205724757 +0000 UTC m=+0.106249019 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd)
Oct 01 14:02:51 compute-0 nova_compute[192698]: 2025-10-01 14:02:51.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:51 compute-0 nova_compute[192698]: 2025-10-01 14:02:51.216 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6919878f-3d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:02:51 compute-0 nova_compute[192698]: 2025-10-01 14:02:51.216 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap6919878f-3d, col_values=(('qos', UUID('8800eba0-908d-401a-b2eb-aa2b78e23043')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:02:51 compute-0 nova_compute[192698]: 2025-10-01 14:02:51.218 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap6919878f-3d, col_values=(('external_ids', {'iface-id': '6919878f-3d2e-4ab5-b34c-215a0ff9579b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3b:f1:fb', 'vm-uuid': '6bbb3240-f185-4efc-9aaa-ed008923c68a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:02:51 compute-0 nova_compute[192698]: 2025-10-01 14:02:51.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:51 compute-0 nova_compute[192698]: 2025-10-01 14:02:51.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:02:51 compute-0 NetworkManager[51741]: <info>  [1759327371.2227] manager: (tap6919878f-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Oct 01 14:02:51 compute-0 nova_compute[192698]: 2025-10-01 14:02:51.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:51 compute-0 nova_compute[192698]: 2025-10-01 14:02:51.232 2 INFO os_vif [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:f1:fb,bridge_name='br-int',has_traffic_filtering=True,id=6919878f-3d2e-4ab5-b34c-215a0ff9579b,network=Network(8b83843d-74e6-401a-9419-27491d8fece7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6919878f-3d')
Oct 01 14:02:51 compute-0 nova_compute[192698]: 2025-10-01 14:02:51.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:52 compute-0 nova_compute[192698]: 2025-10-01 14:02:52.783 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:02:52 compute-0 nova_compute[192698]: 2025-10-01 14:02:52.783 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:02:52 compute-0 nova_compute[192698]: 2025-10-01 14:02:52.784 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] No VIF found with MAC fa:16:3e:3b:f1:fb, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:02:52 compute-0 nova_compute[192698]: 2025-10-01 14:02:52.785 2 INFO nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Using config drive
Oct 01 14:02:53 compute-0 nova_compute[192698]: 2025-10-01 14:02:53.298 2 WARNING neutronclient.v2_0.client [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:02:54 compute-0 nova_compute[192698]: 2025-10-01 14:02:54.184 2 INFO nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Creating config drive at /var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk.config
Oct 01 14:02:54 compute-0 nova_compute[192698]: 2025-10-01 14:02:54.193 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpjhkvrr7q execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:02:54 compute-0 nova_compute[192698]: 2025-10-01 14:02:54.339 2 DEBUG oslo_concurrency.processutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpjhkvrr7q" returned: 0 in 0.146s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:02:54 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct 01 14:02:54 compute-0 kernel: tap6919878f-3d: entered promiscuous mode
Oct 01 14:02:54 compute-0 NetworkManager[51741]: <info>  [1759327374.4701] manager: (tap6919878f-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/23)
Oct 01 14:02:54 compute-0 ovn_controller[94909]: 2025-10-01T14:02:54Z|00040|binding|INFO|Claiming lport 6919878f-3d2e-4ab5-b34c-215a0ff9579b for this chassis.
Oct 01 14:02:54 compute-0 ovn_controller[94909]: 2025-10-01T14:02:54Z|00041|binding|INFO|6919878f-3d2e-4ab5-b34c-215a0ff9579b: Claiming fa:16:3e:3b:f1:fb 10.100.0.9
Oct 01 14:02:54 compute-0 nova_compute[192698]: 2025-10-01 14:02:54.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:54 compute-0 nova_compute[192698]: 2025-10-01 14:02:54.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:54.504 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:f1:fb 10.100.0.9'], port_security=['fa:16:3e:3b:f1:fb 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6bbb3240-f185-4efc-9aaa-ed008923c68a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b83843d-74e6-401a-9419-27491d8fece7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557b5b6333c4a08801c674394739795', 'neutron:revision_number': '4', 'neutron:security_group_ids': '980a288f-8f12-4b4d-bb44-f026855382c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=74ce129b-0e7f-4161-8565-94672e8a679d, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=6919878f-3d2e-4ab5-b34c-215a0ff9579b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:02:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:54.505 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 6919878f-3d2e-4ab5-b34c-215a0ff9579b in datapath 8b83843d-74e6-401a-9419-27491d8fece7 bound to our chassis
Oct 01 14:02:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:54.507 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8b83843d-74e6-401a-9419-27491d8fece7
Oct 01 14:02:54 compute-0 systemd-udevd[215734]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:02:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:54.542 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[9b64ea44-135d-423d-b2fc-5303a486d5c6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:54.543 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8b83843d-71 in ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:02:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:54.547 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8b83843d-70 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:02:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:54.547 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ac45973d-0a06-40d7-8c88-b48a36ff8968]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:54.548 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f19fd9ed-37ff-4076-9829-3f05d63283c5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:54 compute-0 NetworkManager[51741]: <info>  [1759327374.5607] device (tap6919878f-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:02:54 compute-0 NetworkManager[51741]: <info>  [1759327374.5624] device (tap6919878f-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:02:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:54.576 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[2c930838-5994-4a10-9718-fde10f0da0a8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:54 compute-0 systemd-machined[152704]: New machine qemu-1-instance-00000003.
Oct 01 14:02:54 compute-0 nova_compute[192698]: 2025-10-01 14:02:54.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:54 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Oct 01 14:02:54 compute-0 ovn_controller[94909]: 2025-10-01T14:02:54Z|00042|binding|INFO|Setting lport 6919878f-3d2e-4ab5-b34c-215a0ff9579b ovn-installed in OVS
Oct 01 14:02:54 compute-0 ovn_controller[94909]: 2025-10-01T14:02:54Z|00043|binding|INFO|Setting lport 6919878f-3d2e-4ab5-b34c-215a0ff9579b up in Southbound
Oct 01 14:02:54 compute-0 nova_compute[192698]: 2025-10-01 14:02:54.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:54.608 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[fa74504e-6aea-46d1-a8f7-c287260801d8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:54.610 103791 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmplkc2jle1/privsep.sock']
Oct 01 14:02:54 compute-0 nova_compute[192698]: 2025-10-01 14:02:54.813 2 DEBUG nova.compute.manager [req-f9fdceef-ce99-482c-b905-49373a0c49dc req-c3729015-f2d6-4827-afce-c4f0f44875ec 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Received event network-vif-plugged-6919878f-3d2e-4ab5-b34c-215a0ff9579b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:02:54 compute-0 nova_compute[192698]: 2025-10-01 14:02:54.816 2 DEBUG oslo_concurrency.lockutils [req-f9fdceef-ce99-482c-b905-49373a0c49dc req-c3729015-f2d6-4827-afce-c4f0f44875ec 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:02:54 compute-0 nova_compute[192698]: 2025-10-01 14:02:54.816 2 DEBUG oslo_concurrency.lockutils [req-f9fdceef-ce99-482c-b905-49373a0c49dc req-c3729015-f2d6-4827-afce-c4f0f44875ec 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:02:54 compute-0 nova_compute[192698]: 2025-10-01 14:02:54.817 2 DEBUG oslo_concurrency.lockutils [req-f9fdceef-ce99-482c-b905-49373a0c49dc req-c3729015-f2d6-4827-afce-c4f0f44875ec 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:02:54 compute-0 nova_compute[192698]: 2025-10-01 14:02:54.817 2 DEBUG nova.compute.manager [req-f9fdceef-ce99-482c-b905-49373a0c49dc req-c3729015-f2d6-4827-afce-c4f0f44875ec 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Processing event network-vif-plugged-6919878f-3d2e-4ab5-b34c-215a0ff9579b _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:02:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:55.401 103791 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct 01 14:02:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:55.402 103791 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmplkc2jle1/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Oct 01 14:02:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:55.245 215767 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct 01 14:02:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:55.250 215767 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct 01 14:02:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:55.252 215767 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Oct 01 14:02:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:55.253 215767 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215767
Oct 01 14:02:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:55.404 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[47c84144-f3b7-4c22-af97-15b73b16e94b]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:55 compute-0 nova_compute[192698]: 2025-10-01 14:02:55.523 2 DEBUG nova.compute.manager [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:02:55 compute-0 nova_compute[192698]: 2025-10-01 14:02:55.542 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 01 14:02:55 compute-0 nova_compute[192698]: 2025-10-01 14:02:55.548 2 INFO nova.virt.libvirt.driver [-] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Instance spawned successfully.
Oct 01 14:02:55 compute-0 nova_compute[192698]: 2025-10-01 14:02:55.549 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 01 14:02:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:55.929 215767 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:02:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:55.929 215767 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:02:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:55.929 215767 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.068 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.069 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.070 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.071 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.072 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.073 2 DEBUG nova.virt.libvirt.driver [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.450 215767 INFO oslo_service.backend [-] Loading backend: eventlet
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.456 215767 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.554 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[8a3d1512-cd8e-4863-9c6e-0bc4ac39dd94]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.562 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[2a619e4a-e978-4405-98aa-2a7889067cec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:56 compute-0 systemd-udevd[215739]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:02:56 compute-0 NetworkManager[51741]: <info>  [1759327376.5687] manager: (tap8b83843d-70): new Veth device (/org/freedesktop/NetworkManager/Devices/24)
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.594 2 INFO nova.compute.manager [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Took 13.05 seconds to spawn the instance on the hypervisor.
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.596 2 DEBUG nova.compute.manager [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.624 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[6e759630-f8b9-451e-a1d1-b19a7925de7b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.629 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[1b6f86df-621e-4946-9af3-85666c3aad6f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:56 compute-0 NetworkManager[51741]: <info>  [1759327376.6709] device (tap8b83843d-70): carrier: link connected
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.682 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[49ab5cee-1273-494f-b64b-92120d5ed45b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.714 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[061b814e-6df5-4012-81a5-8028993ef1c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b83843d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:8b:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373720, 'reachable_time': 18889, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215788, 'error': None, 'target': 'ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.748 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[45cc8dae-7c84-45d6-81e3-bee0e71cc7b0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4a:8b50'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 373720, 'tstamp': 373720}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215789, 'error': None, 'target': 'ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.778 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[58b56d28-be48-4bbf-a9a4-531ac7809259]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b83843d-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4a:8b:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373720, 'reachable_time': 18889, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215790, 'error': None, 'target': 'ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.839 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[5465bf4c-2561-4761-8161-7cea888179d7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.873 2 DEBUG nova.compute.manager [req-dd191361-4246-4ad3-96f1-ee40bc4be605 req-eaba33c6-e316-4e38-bcc9-43c0a966ec1a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Received event network-vif-plugged-6919878f-3d2e-4ab5-b34c-215a0ff9579b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.874 2 DEBUG oslo_concurrency.lockutils [req-dd191361-4246-4ad3-96f1-ee40bc4be605 req-eaba33c6-e316-4e38-bcc9-43c0a966ec1a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.874 2 DEBUG oslo_concurrency.lockutils [req-dd191361-4246-4ad3-96f1-ee40bc4be605 req-eaba33c6-e316-4e38-bcc9-43c0a966ec1a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.874 2 DEBUG oslo_concurrency.lockutils [req-dd191361-4246-4ad3-96f1-ee40bc4be605 req-eaba33c6-e316-4e38-bcc9-43c0a966ec1a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.875 2 DEBUG nova.compute.manager [req-dd191361-4246-4ad3-96f1-ee40bc4be605 req-eaba33c6-e316-4e38-bcc9-43c0a966ec1a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] No waiting events found dispatching network-vif-plugged-6919878f-3d2e-4ab5-b34c-215a0ff9579b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.875 2 WARNING nova.compute.manager [req-dd191361-4246-4ad3-96f1-ee40bc4be605 req-eaba33c6-e316-4e38-bcc9-43c0a966ec1a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Received unexpected event network-vif-plugged-6919878f-3d2e-4ab5-b34c-215a0ff9579b for instance with vm_state active and task_state None.
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.944 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1d6cf67b-e904-423b-af19-4ca6ef813f1c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.947 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b83843d-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.948 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.948 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b83843d-70, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:02:56 compute-0 kernel: tap8b83843d-70: entered promiscuous mode
Oct 01 14:02:56 compute-0 NetworkManager[51741]: <info>  [1759327376.9888] manager: (tap8b83843d-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.993 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8b83843d-70, col_values=(('external_ids', {'iface-id': 'bba308e4-f061-4ef4-89aa-6683f56ddb01'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:56 compute-0 ovn_controller[94909]: 2025-10-01T14:02:56Z|00044|binding|INFO|Releasing lport bba308e4-f061-4ef4-89aa-6683f56ddb01 from this chassis (sb_readonly=0)
Oct 01 14:02:56 compute-0 nova_compute[192698]: 2025-10-01 14:02:56.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.998 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[43574829-5dcc-48a4-b9ac-a7840a7f04d6]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.999 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8b83843d-74e6-401a-9419-27491d8fece7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8b83843d-74e6-401a-9419-27491d8fece7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.999 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8b83843d-74e6-401a-9419-27491d8fece7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8b83843d-74e6-401a-9419-27491d8fece7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:56.999 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 8b83843d-74e6-401a-9419-27491d8fece7 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:57.000 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8b83843d-74e6-401a-9419-27491d8fece7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8b83843d-74e6-401a-9419-27491d8fece7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:57.000 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[71c03e42-c8bf-42ed-a489-69f13efc38a5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:57.001 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8b83843d-74e6-401a-9419-27491d8fece7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8b83843d-74e6-401a-9419-27491d8fece7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:57.001 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[47a2f5c9-e232-4ee5-a5b3-26167c423e16]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:57.002 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-8b83843d-74e6-401a-9419-27491d8fece7
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/8b83843d-74e6-401a-9419-27491d8fece7.pid.haproxy
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID 8b83843d-74e6-401a-9419-27491d8fece7
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:02:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:02:57.003 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7', 'env', 'PROCESS_TAG=haproxy-8b83843d-74e6-401a-9419-27491d8fece7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8b83843d-74e6-401a-9419-27491d8fece7.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:02:57 compute-0 nova_compute[192698]: 2025-10-01 14:02:57.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:02:57 compute-0 nova_compute[192698]: 2025-10-01 14:02:57.146 2 INFO nova.compute.manager [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Took 18.40 seconds to build instance.
Oct 01 14:02:57 compute-0 podman[215823]: 2025-10-01 14:02:57.484013335 +0000 UTC m=+0.072724214 container create 58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 01 14:02:57 compute-0 systemd[1]: Started libpod-conmon-58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb.scope.
Oct 01 14:02:57 compute-0 podman[215823]: 2025-10-01 14:02:57.443550563 +0000 UTC m=+0.032261472 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:02:57 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:02:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0db12906ce462c69e050dd076edb88eca5029e72b790b939833886dd78df0e33/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:02:57 compute-0 podman[215823]: 2025-10-01 14:02:57.567938972 +0000 UTC m=+0.156649941 container init 58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 01 14:02:57 compute-0 podman[215823]: 2025-10-01 14:02:57.580600293 +0000 UTC m=+0.169311202 container start 58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 01 14:02:57 compute-0 neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7[215839]: [NOTICE]   (215843) : New worker (215845) forked
Oct 01 14:02:57 compute-0 neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7[215839]: [NOTICE]   (215843) : Loading success.
Oct 01 14:02:57 compute-0 nova_compute[192698]: 2025-10-01 14:02:57.651 2 DEBUG oslo_concurrency.lockutils [None req-c6ab0585-c5bf-47cb-b870-ce8a61efe280 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.926s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:02:58 compute-0 nova_compute[192698]: 2025-10-01 14:02:58.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:02:58 compute-0 nova_compute[192698]: 2025-10-01 14:02:58.927 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 01 14:02:59 compute-0 podman[215854]: 2025-10-01 14:02:59.187616625 +0000 UTC m=+0.096876947 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:02:59 compute-0 podman[203144]: time="2025-10-01T14:02:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:02:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:02:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20749 "" "Go-http-client/1.1"
Oct 01 14:02:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:02:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3449 "" "Go-http-client/1.1"
Oct 01 14:03:01 compute-0 nova_compute[192698]: 2025-10-01 14:03:01.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:01 compute-0 openstack_network_exporter[205307]: ERROR   14:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:03:01 compute-0 openstack_network_exporter[205307]: ERROR   14:03:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:03:01 compute-0 openstack_network_exporter[205307]: ERROR   14:03:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:03:01 compute-0 openstack_network_exporter[205307]: ERROR   14:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:03:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:03:01 compute-0 openstack_network_exporter[205307]: ERROR   14:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:03:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:03:01 compute-0 nova_compute[192698]: 2025-10-01 14:03:01.888 2 DEBUG oslo_concurrency.lockutils [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Acquiring lock "6bbb3240-f185-4efc-9aaa-ed008923c68a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:03:01 compute-0 nova_compute[192698]: 2025-10-01 14:03:01.889 2 DEBUG oslo_concurrency.lockutils [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:03:01 compute-0 nova_compute[192698]: 2025-10-01 14:03:01.889 2 DEBUG oslo_concurrency.lockutils [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Acquiring lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:03:01 compute-0 nova_compute[192698]: 2025-10-01 14:03:01.890 2 DEBUG oslo_concurrency.lockutils [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:03:01 compute-0 nova_compute[192698]: 2025-10-01 14:03:01.890 2 DEBUG oslo_concurrency.lockutils [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:03:01 compute-0 nova_compute[192698]: 2025-10-01 14:03:01.906 2 INFO nova.compute.manager [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Terminating instance
Oct 01 14:03:01 compute-0 nova_compute[192698]: 2025-10-01 14:03:01.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:03:02 compute-0 nova_compute[192698]: 2025-10-01 14:03:02.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:02 compute-0 nova_compute[192698]: 2025-10-01 14:03:02.427 2 DEBUG nova.compute.manager [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:03:02 compute-0 kernel: tap6919878f-3d (unregistering): left promiscuous mode
Oct 01 14:03:02 compute-0 NetworkManager[51741]: <info>  [1759327382.4555] device (tap6919878f-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:03:02 compute-0 ovn_controller[94909]: 2025-10-01T14:03:02Z|00045|binding|INFO|Releasing lport 6919878f-3d2e-4ab5-b34c-215a0ff9579b from this chassis (sb_readonly=0)
Oct 01 14:03:02 compute-0 ovn_controller[94909]: 2025-10-01T14:03:02Z|00046|binding|INFO|Setting lport 6919878f-3d2e-4ab5-b34c-215a0ff9579b down in Southbound
Oct 01 14:03:02 compute-0 ovn_controller[94909]: 2025-10-01T14:03:02Z|00047|binding|INFO|Removing iface tap6919878f-3d ovn-installed in OVS
Oct 01 14:03:02 compute-0 nova_compute[192698]: 2025-10-01 14:03:02.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.481 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:f1:fb 10.100.0.9'], port_security=['fa:16:3e:3b:f1:fb 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6bbb3240-f185-4efc-9aaa-ed008923c68a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b83843d-74e6-401a-9419-27491d8fece7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557b5b6333c4a08801c674394739795', 'neutron:revision_number': '5', 'neutron:security_group_ids': '980a288f-8f12-4b4d-bb44-f026855382c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=74ce129b-0e7f-4161-8565-94672e8a679d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=6919878f-3d2e-4ab5-b34c-215a0ff9579b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.483 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 6919878f-3d2e-4ab5-b34c-215a0ff9579b in datapath 8b83843d-74e6-401a-9419-27491d8fece7 unbound from our chassis
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.484 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8b83843d-74e6-401a-9419-27491d8fece7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.487 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d93c6e82-5afc-49eb-9bb1-908401e745c9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.488 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7 namespace which is not needed anymore
Oct 01 14:03:02 compute-0 nova_compute[192698]: 2025-10-01 14:03:02.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:02 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Oct 01 14:03:02 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 7.865s CPU time.
Oct 01 14:03:02 compute-0 systemd-machined[152704]: Machine qemu-1-instance-00000003 terminated.
Oct 01 14:03:02 compute-0 neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7[215839]: [NOTICE]   (215843) : haproxy version is 3.0.5-8e879a5
Oct 01 14:03:02 compute-0 neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7[215839]: [NOTICE]   (215843) : path to executable is /usr/sbin/haproxy
Oct 01 14:03:02 compute-0 neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7[215839]: [WARNING]  (215843) : Exiting Master process...
Oct 01 14:03:02 compute-0 podman[215902]: 2025-10-01 14:03:02.629662822 +0000 UTC m=+0.045159510 container kill 58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 14:03:02 compute-0 neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7[215839]: [ALERT]    (215843) : Current worker (215845) exited with code 143 (Terminated)
Oct 01 14:03:02 compute-0 neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7[215839]: [WARNING]  (215843) : All workers exited. Exiting... (0)
Oct 01 14:03:02 compute-0 systemd[1]: libpod-58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb.scope: Deactivated successfully.
Oct 01 14:03:02 compute-0 podman[215917]: 2025-10-01 14:03:02.686641761 +0000 UTC m=+0.037692749 container died 58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 01 14:03:02 compute-0 nova_compute[192698]: 2025-10-01 14:03:02.711 2 INFO nova.virt.libvirt.driver [-] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Instance destroyed successfully.
Oct 01 14:03:02 compute-0 nova_compute[192698]: 2025-10-01 14:03:02.713 2 DEBUG nova.objects.instance [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lazy-loading 'resources' on Instance uuid 6bbb3240-f185-4efc-9aaa-ed008923c68a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:03:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb-userdata-shm.mount: Deactivated successfully.
Oct 01 14:03:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-0db12906ce462c69e050dd076edb88eca5029e72b790b939833886dd78df0e33-merged.mount: Deactivated successfully.
Oct 01 14:03:02 compute-0 podman[215917]: 2025-10-01 14:03:02.73846128 +0000 UTC m=+0.089512228 container cleanup 58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:03:02 compute-0 systemd[1]: libpod-conmon-58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb.scope: Deactivated successfully.
Oct 01 14:03:02 compute-0 podman[215926]: 2025-10-01 14:03:02.756513657 +0000 UTC m=+0.078443939 container remove 58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.780 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f519cf17-2775-4058-8e66-362fef65e0a3]: (4, ("Wed Oct  1 02:03:02 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7 (58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb)\n58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb\nWed Oct  1 02:03:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7 (58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb)\n58f3385a49e28e004626b1c7cc6230d42fa70833fc6142a61786a5e9f232bcdb\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.782 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[8943ee51-c2b6-492a-aaf2-dd80cf1b5e2b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.783 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8b83843d-74e6-401a-9419-27491d8fece7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8b83843d-74e6-401a-9419-27491d8fece7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.784 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[bdba3346-0c4e-48ba-8615-d91e657489f4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.785 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b83843d-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:03:02 compute-0 nova_compute[192698]: 2025-10-01 14:03:02.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:02 compute-0 kernel: tap8b83843d-70: left promiscuous mode
Oct 01 14:03:02 compute-0 nova_compute[192698]: 2025-10-01 14:03:02.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:02 compute-0 nova_compute[192698]: 2025-10-01 14:03:02.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.825 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[8466bbaa-d351-4ebf-8160-962dcc541c25]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.857 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[90fb54d5-11cd-4bc5-acf1-f4ec3997c95f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.860 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[952b8019-2fdd-4105-b512-b97304925e8f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.883 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[fa483077-5f9a-490a-808d-721a8e0331b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373707, 'reachable_time': 35699, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215964, 'error': None, 'target': 'ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.892 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8b83843d-74e6-401a-9419-27491d8fece7 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:03:02 compute-0 systemd[1]: run-netns-ovnmeta\x2d8b83843d\x2d74e6\x2d401a\x2d9419\x2d27491d8fece7.mount: Deactivated successfully.
Oct 01 14:03:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:02.893 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[0c62f637-5680-49ae-8a8a-62f20e778bda]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.036 2 DEBUG nova.compute.manager [req-4cc11d49-3e11-40f6-9c4b-f05bd675ed46 req-ca6d8fdf-348e-409d-a4a9-3252f6552261 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Received event network-vif-unplugged-6919878f-3d2e-4ab5-b34c-215a0ff9579b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.037 2 DEBUG oslo_concurrency.lockutils [req-4cc11d49-3e11-40f6-9c4b-f05bd675ed46 req-ca6d8fdf-348e-409d-a4a9-3252f6552261 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.039 2 DEBUG oslo_concurrency.lockutils [req-4cc11d49-3e11-40f6-9c4b-f05bd675ed46 req-ca6d8fdf-348e-409d-a4a9-3252f6552261 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.039 2 DEBUG oslo_concurrency.lockutils [req-4cc11d49-3e11-40f6-9c4b-f05bd675ed46 req-ca6d8fdf-348e-409d-a4a9-3252f6552261 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.040 2 DEBUG nova.compute.manager [req-4cc11d49-3e11-40f6-9c4b-f05bd675ed46 req-ca6d8fdf-348e-409d-a4a9-3252f6552261 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] No waiting events found dispatching network-vif-unplugged-6919878f-3d2e-4ab5-b34c-215a0ff9579b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.040 2 DEBUG nova.compute.manager [req-4cc11d49-3e11-40f6-9c4b-f05bd675ed46 req-ca6d8fdf-348e-409d-a4a9-3252f6552261 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Received event network-vif-unplugged-6919878f-3d2e-4ab5-b34c-215a0ff9579b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.221 2 DEBUG nova.virt.libvirt.vif [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:02:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestDataModel-server-1148779107',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testdatamodel-server-1148779107',id=3,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:02:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d557b5b6333c4a08801c674394739795',ramdisk_id='',reservation_id='r-nh11smmq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_projec
t_name='tempest-TestDataModel-621315703',owner_user_name='tempest-TestDataModel-621315703-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:02:56Z,user_data=None,user_id='0874d87de40e4626b04c1e4b35a90268',uuid=6bbb3240-f185-4efc-9aaa-ed008923c68a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "address": "fa:16:3e:3b:f1:fb", "network": {"id": "8b83843d-74e6-401a-9419-27491d8fece7", "bridge": "br-int", "label": "tempest-TestDataModel-780178721-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18a5e4cbf5004eb69c4a4632324f35d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6919878f-3d", "ovs_interfaceid": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.223 2 DEBUG nova.network.os_vif_util [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Converting VIF {"id": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "address": "fa:16:3e:3b:f1:fb", "network": {"id": "8b83843d-74e6-401a-9419-27491d8fece7", "bridge": "br-int", "label": "tempest-TestDataModel-780178721-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "18a5e4cbf5004eb69c4a4632324f35d0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6919878f-3d", "ovs_interfaceid": "6919878f-3d2e-4ab5-b34c-215a0ff9579b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.224 2 DEBUG nova.network.os_vif_util [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:f1:fb,bridge_name='br-int',has_traffic_filtering=True,id=6919878f-3d2e-4ab5-b34c-215a0ff9579b,network=Network(8b83843d-74e6-401a-9419-27491d8fece7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6919878f-3d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.225 2 DEBUG os_vif [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:f1:fb,bridge_name='br-int',has_traffic_filtering=True,id=6919878f-3d2e-4ab5-b34c-215a0ff9579b,network=Network(8b83843d-74e6-401a-9419-27491d8fece7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6919878f-3d') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.230 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6919878f-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.237 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=8800eba0-908d-401a-b2eb-aa2b78e23043) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.242 2 INFO os_vif [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:f1:fb,bridge_name='br-int',has_traffic_filtering=True,id=6919878f-3d2e-4ab5-b34c-215a0ff9579b,network=Network(8b83843d-74e6-401a-9419-27491d8fece7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6919878f-3d')
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.243 2 INFO nova.virt.libvirt.driver [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Deleting instance files /var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a_del
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.245 2 INFO nova.virt.libvirt.driver [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Deletion of /var/lib/nova/instances/6bbb3240-f185-4efc-9aaa-ed008923c68a_del complete
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.431 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.761 2 INFO nova.compute.manager [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Took 1.33 seconds to destroy the instance on the hypervisor.
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.763 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.765 2 DEBUG nova.compute.manager [-] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.765 2 DEBUG nova.network.neutron [-] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.766 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:03:03 compute-0 nova_compute[192698]: 2025-10-01 14:03:03.942 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:03:04 compute-0 nova_compute[192698]: 2025-10-01 14:03:04.439 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:03:04 compute-0 nova_compute[192698]: 2025-10-01 14:03:04.440 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:03:04 compute-0 nova_compute[192698]: 2025-10-01 14:03:04.441 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:03:04 compute-0 nova_compute[192698]: 2025-10-01 14:03:04.442 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:03:04 compute-0 nova_compute[192698]: 2025-10-01 14:03:04.670 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:03:04 compute-0 nova_compute[192698]: 2025-10-01 14:03:04.671 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:03:04 compute-0 nova_compute[192698]: 2025-10-01 14:03:04.696 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:03:04 compute-0 nova_compute[192698]: 2025-10-01 14:03:04.696 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5788MB free_disk=73.30675888061523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:03:04 compute-0 nova_compute[192698]: 2025-10-01 14:03:04.697 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:03:04 compute-0 nova_compute[192698]: 2025-10-01 14:03:04.697 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:03:04 compute-0 nova_compute[192698]: 2025-10-01 14:03:04.723 2 DEBUG nova.network.neutron [-] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:03:05 compute-0 nova_compute[192698]: 2025-10-01 14:03:05.153 2 DEBUG nova.compute.manager [req-dbf0f74d-6c51-4fe9-a0df-5a7223adaaee req-91bf5ad0-6219-4ca5-844c-30b9415c9e15 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Received event network-vif-unplugged-6919878f-3d2e-4ab5-b34c-215a0ff9579b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:03:05 compute-0 nova_compute[192698]: 2025-10-01 14:03:05.153 2 DEBUG oslo_concurrency.lockutils [req-dbf0f74d-6c51-4fe9-a0df-5a7223adaaee req-91bf5ad0-6219-4ca5-844c-30b9415c9e15 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:03:05 compute-0 nova_compute[192698]: 2025-10-01 14:03:05.154 2 DEBUG oslo_concurrency.lockutils [req-dbf0f74d-6c51-4fe9-a0df-5a7223adaaee req-91bf5ad0-6219-4ca5-844c-30b9415c9e15 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:03:05 compute-0 nova_compute[192698]: 2025-10-01 14:03:05.154 2 DEBUG oslo_concurrency.lockutils [req-dbf0f74d-6c51-4fe9-a0df-5a7223adaaee req-91bf5ad0-6219-4ca5-844c-30b9415c9e15 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:03:05 compute-0 nova_compute[192698]: 2025-10-01 14:03:05.155 2 DEBUG nova.compute.manager [req-dbf0f74d-6c51-4fe9-a0df-5a7223adaaee req-91bf5ad0-6219-4ca5-844c-30b9415c9e15 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] No waiting events found dispatching network-vif-unplugged-6919878f-3d2e-4ab5-b34c-215a0ff9579b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:03:05 compute-0 nova_compute[192698]: 2025-10-01 14:03:05.155 2 DEBUG nova.compute.manager [req-dbf0f74d-6c51-4fe9-a0df-5a7223adaaee req-91bf5ad0-6219-4ca5-844c-30b9415c9e15 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Received event network-vif-unplugged-6919878f-3d2e-4ab5-b34c-215a0ff9579b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:03:05 compute-0 nova_compute[192698]: 2025-10-01 14:03:05.156 2 DEBUG nova.compute.manager [req-dbf0f74d-6c51-4fe9-a0df-5a7223adaaee req-91bf5ad0-6219-4ca5-844c-30b9415c9e15 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Received event network-vif-deleted-6919878f-3d2e-4ab5-b34c-215a0ff9579b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:03:05 compute-0 nova_compute[192698]: 2025-10-01 14:03:05.229 2 INFO nova.compute.manager [-] [instance: 6bbb3240-f185-4efc-9aaa-ed008923c68a] Took 1.46 seconds to deallocate network for instance.
Oct 01 14:03:05 compute-0 nova_compute[192698]: 2025-10-01 14:03:05.748 2 DEBUG oslo_concurrency.lockutils [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:03:05 compute-0 nova_compute[192698]: 2025-10-01 14:03:05.753 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 6bbb3240-f185-4efc-9aaa-ed008923c68a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:03:05 compute-0 nova_compute[192698]: 2025-10-01 14:03:05.754 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:03:05 compute-0 nova_compute[192698]: 2025-10-01 14:03:05.754 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:03:04 up  1:02,  0 user,  load average: 0.36, 0.40, 0.54\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_d557b5b6333c4a08801c674394739795': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:03:05 compute-0 nova_compute[192698]: 2025-10-01 14:03:05.796 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 14:03:06 compute-0 nova_compute[192698]: 2025-10-01 14:03:06.325 2 ERROR nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [req-984a3e54-7ec3-4255-ae9a-6104db237d31] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID ee1e54f5-453b-4949-a499-9a192f03b8f0.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-984a3e54-7ec3-4255-ae9a-6104db237d31"}]}
Oct 01 14:03:06 compute-0 nova_compute[192698]: 2025-10-01 14:03:06.350 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing inventories for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 01 14:03:06 compute-0 nova_compute[192698]: 2025-10-01 14:03:06.365 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating ProviderTree inventory for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 01 14:03:06 compute-0 nova_compute[192698]: 2025-10-01 14:03:06.366 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 14:03:06 compute-0 nova_compute[192698]: 2025-10-01 14:03:06.383 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing aggregate associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 01 14:03:06 compute-0 nova_compute[192698]: 2025-10-01 14:03:06.405 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing trait associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, traits: COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_BMI,COMPU
TE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 01 14:03:06 compute-0 nova_compute[192698]: 2025-10-01 14:03:06.450 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 14:03:06 compute-0 nova_compute[192698]: 2025-10-01 14:03:06.996 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updated inventory for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Oct 01 14:03:06 compute-0 nova_compute[192698]: 2025-10-01 14:03:06.996 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 01 14:03:06 compute-0 nova_compute[192698]: 2025-10-01 14:03:06.997 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 14:03:07 compute-0 nova_compute[192698]: 2025-10-01 14:03:07.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:07 compute-0 nova_compute[192698]: 2025-10-01 14:03:07.507 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:03:07 compute-0 nova_compute[192698]: 2025-10-01 14:03:07.508 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.811s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:03:07 compute-0 nova_compute[192698]: 2025-10-01 14:03:07.509 2 DEBUG oslo_concurrency.lockutils [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 1.761s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:03:07 compute-0 nova_compute[192698]: 2025-10-01 14:03:07.554 2 DEBUG nova.compute.provider_tree [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:03:08 compute-0 nova_compute[192698]: 2025-10-01 14:03:08.062 2 DEBUG nova.scheduler.client.report [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:03:08 compute-0 podman[215969]: 2025-10-01 14:03:08.202110565 +0000 UTC m=+0.099664895 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 01 14:03:08 compute-0 nova_compute[192698]: 2025-10-01 14:03:08.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:08 compute-0 podman[215970]: 2025-10-01 14:03:08.321578344 +0000 UTC m=+0.214606711 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:03:08 compute-0 nova_compute[192698]: 2025-10-01 14:03:08.509 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:03:08 compute-0 nova_compute[192698]: 2025-10-01 14:03:08.510 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:03:08 compute-0 nova_compute[192698]: 2025-10-01 14:03:08.510 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:03:08 compute-0 nova_compute[192698]: 2025-10-01 14:03:08.511 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:03:08 compute-0 nova_compute[192698]: 2025-10-01 14:03:08.511 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:03:08 compute-0 nova_compute[192698]: 2025-10-01 14:03:08.575 2 DEBUG oslo_concurrency.lockutils [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.066s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:03:08 compute-0 nova_compute[192698]: 2025-10-01 14:03:08.609 2 INFO nova.scheduler.client.report [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Deleted allocations for instance 6bbb3240-f185-4efc-9aaa-ed008923c68a
Oct 01 14:03:09 compute-0 nova_compute[192698]: 2025-10-01 14:03:09.643 2 DEBUG oslo_concurrency.lockutils [None req-9c0b4ebc-434f-446a-b010-7e88b1dfe8e5 0874d87de40e4626b04c1e4b35a90268 d557b5b6333c4a08801c674394739795 - - default default] Lock "6bbb3240-f185-4efc-9aaa-ed008923c68a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.754s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:03:09 compute-0 nova_compute[192698]: 2025-10-01 14:03:09.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:03:09 compute-0 nova_compute[192698]: 2025-10-01 14:03:09.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:03:10 compute-0 nova_compute[192698]: 2025-10-01 14:03:10.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:03:10 compute-0 nova_compute[192698]: 2025-10-01 14:03:10.927 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 01 14:03:11 compute-0 nova_compute[192698]: 2025-10-01 14:03:11.435 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 01 14:03:12 compute-0 nova_compute[192698]: 2025-10-01 14:03:12.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:12 compute-0 nova_compute[192698]: 2025-10-01 14:03:12.423 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:03:13 compute-0 nova_compute[192698]: 2025-10-01 14:03:13.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:14.230 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:03:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:14.230 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:03:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:14.230 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:03:16 compute-0 podman[216014]: 2025-10-01 14:03:16.184552084 +0000 UTC m=+0.091133124 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, name=ubi9-minimal, config_id=edpm, com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 01 14:03:17 compute-0 nova_compute[192698]: 2025-10-01 14:03:17.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:18 compute-0 nova_compute[192698]: 2025-10-01 14:03:18.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:21 compute-0 nova_compute[192698]: 2025-10-01 14:03:21.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:22 compute-0 nova_compute[192698]: 2025-10-01 14:03:22.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:22 compute-0 podman[216037]: 2025-10-01 14:03:22.226197823 +0000 UTC m=+0.133914560 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 01 14:03:22 compute-0 podman[216038]: 2025-10-01 14:03:22.257189711 +0000 UTC m=+0.155418202 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 01 14:03:23 compute-0 nova_compute[192698]: 2025-10-01 14:03:23.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:27 compute-0 nova_compute[192698]: 2025-10-01 14:03:27.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:28 compute-0 nova_compute[192698]: 2025-10-01 14:03:28.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:29 compute-0 podman[203144]: time="2025-10-01T14:03:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:03:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:03:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:03:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:03:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3000 "" "Go-http-client/1.1"
Oct 01 14:03:30 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:30.036 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:1b:50 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b14b3910fae84828afa468e1e645402b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e3a455d-1f77-441e-b08a-0ec8231910e5, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3f9111f1-79b1-4bf1-bb95-d924c71fb42c) old=Port_Binding(mac=['fa:16:3e:47:1b:50'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b14b3910fae84828afa468e1e645402b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:03:30 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:30.037 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3f9111f1-79b1-4bf1-bb95-d924c71fb42c in datapath e35f096a-fd75-4d70-ae58-8a76ae666b9d updated
Oct 01 14:03:30 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:30.038 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e35f096a-fd75-4d70-ae58-8a76ae666b9d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:03:30 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:30.039 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[0c0a1aef-0887-4254-802d-3c93a73a73c1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:03:30 compute-0 podman[216074]: 2025-10-01 14:03:30.17970539 +0000 UTC m=+0.085306927 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:03:31 compute-0 openstack_network_exporter[205307]: ERROR   14:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:03:31 compute-0 openstack_network_exporter[205307]: ERROR   14:03:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:03:31 compute-0 openstack_network_exporter[205307]: ERROR   14:03:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:03:31 compute-0 openstack_network_exporter[205307]: ERROR   14:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:03:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:03:31 compute-0 openstack_network_exporter[205307]: ERROR   14:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:03:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:03:32 compute-0 nova_compute[192698]: 2025-10-01 14:03:32.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:33 compute-0 nova_compute[192698]: 2025-10-01 14:03:33.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:37 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:37.161 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:89:c0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-3fe23c69-48b8-42a9-a4f7-ae8043dbee47', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3fe23c69-48b8-42a9-a4f7-ae8043dbee47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67079b4774294271895bbf7b04f602e7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ce5354ea-08fd-4704-b059-2156ff7cb5fc, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=39742005-a256-4528-872f-8d19a1c08e02) old=Port_Binding(mac=['fa:16:3e:81:89:c0'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-3fe23c69-48b8-42a9-a4f7-ae8043dbee47', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3fe23c69-48b8-42a9-a4f7-ae8043dbee47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67079b4774294271895bbf7b04f602e7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:03:37 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:37.163 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 39742005-a256-4528-872f-8d19a1c08e02 in datapath 3fe23c69-48b8-42a9-a4f7-ae8043dbee47 updated
Oct 01 14:03:37 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:37.164 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3fe23c69-48b8-42a9-a4f7-ae8043dbee47, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:03:37 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:03:37.165 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[cd9ce9fc-7625-41c3-b4bd-dae710ae8498]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:03:37 compute-0 nova_compute[192698]: 2025-10-01 14:03:37.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:38 compute-0 nova_compute[192698]: 2025-10-01 14:03:38.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:39 compute-0 podman[216098]: 2025-10-01 14:03:39.191779607 +0000 UTC m=+0.099044928 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 01 14:03:39 compute-0 podman[216099]: 2025-10-01 14:03:39.272939781 +0000 UTC m=+0.171586299 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Oct 01 14:03:42 compute-0 nova_compute[192698]: 2025-10-01 14:03:42.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:43 compute-0 nova_compute[192698]: 2025-10-01 14:03:43.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:47 compute-0 podman[216143]: 2025-10-01 14:03:47.180386242 +0000 UTC m=+0.088030870 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Oct 01 14:03:47 compute-0 nova_compute[192698]: 2025-10-01 14:03:47.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:48 compute-0 nova_compute[192698]: 2025-10-01 14:03:48.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:52 compute-0 nova_compute[192698]: 2025-10-01 14:03:52.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:53 compute-0 podman[216165]: 2025-10-01 14:03:53.173990324 +0000 UTC m=+0.083311353 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930)
Oct 01 14:03:53 compute-0 podman[216164]: 2025-10-01 14:03:53.182594526 +0000 UTC m=+0.091952666 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 01 14:03:53 compute-0 nova_compute[192698]: 2025-10-01 14:03:53.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:54 compute-0 ovn_controller[94909]: 2025-10-01T14:03:54Z|00048|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Oct 01 14:03:57 compute-0 nova_compute[192698]: 2025-10-01 14:03:57.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:58 compute-0 nova_compute[192698]: 2025-10-01 14:03:58.101 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:03:58 compute-0 nova_compute[192698]: 2025-10-01 14:03:58.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:03:59 compute-0 podman[203144]: time="2025-10-01T14:03:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:03:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:03:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:03:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:03:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3000 "" "Go-http-client/1.1"
Oct 01 14:04:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:00.235 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:04:00 compute-0 nova_compute[192698]: 2025-10-01 14:04:00.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:00.236 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:04:01 compute-0 podman[216202]: 2025-10-01 14:04:01.181498109 +0000 UTC m=+0.087532387 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:04:01 compute-0 openstack_network_exporter[205307]: ERROR   14:04:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:04:01 compute-0 openstack_network_exporter[205307]: ERROR   14:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:04:01 compute-0 openstack_network_exporter[205307]: ERROR   14:04:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:04:01 compute-0 openstack_network_exporter[205307]: ERROR   14:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:04:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:04:01 compute-0 openstack_network_exporter[205307]: ERROR   14:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:04:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:04:02 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:02.239 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:04:02 compute-0 nova_compute[192698]: 2025-10-01 14:04:02.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:03 compute-0 nova_compute[192698]: 2025-10-01 14:04:03.437 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:04:03 compute-0 nova_compute[192698]: 2025-10-01 14:04:03.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:05 compute-0 nova_compute[192698]: 2025-10-01 14:04:05.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:04:05 compute-0 nova_compute[192698]: 2025-10-01 14:04:05.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:04:06 compute-0 nova_compute[192698]: 2025-10-01 14:04:06.449 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:04:06 compute-0 nova_compute[192698]: 2025-10-01 14:04:06.450 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:04:06 compute-0 nova_compute[192698]: 2025-10-01 14:04:06.450 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:04:06 compute-0 nova_compute[192698]: 2025-10-01 14:04:06.451 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:04:06 compute-0 nova_compute[192698]: 2025-10-01 14:04:06.684 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:04:06 compute-0 nova_compute[192698]: 2025-10-01 14:04:06.686 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:04:06 compute-0 nova_compute[192698]: 2025-10-01 14:04:06.731 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:04:06 compute-0 nova_compute[192698]: 2025-10-01 14:04:06.732 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5891MB free_disk=73.30677795410156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:04:06 compute-0 nova_compute[192698]: 2025-10-01 14:04:06.732 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:04:06 compute-0 nova_compute[192698]: 2025-10-01 14:04:06.733 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:04:07 compute-0 nova_compute[192698]: 2025-10-01 14:04:07.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:07 compute-0 nova_compute[192698]: 2025-10-01 14:04:07.843 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:04:07 compute-0 nova_compute[192698]: 2025-10-01 14:04:07.844 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:04:06 up  1:03,  0 user,  load average: 0.16, 0.33, 0.51\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:04:07 compute-0 nova_compute[192698]: 2025-10-01 14:04:07.905 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:04:08 compute-0 nova_compute[192698]: 2025-10-01 14:04:08.414 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:04:08 compute-0 nova_compute[192698]: 2025-10-01 14:04:08.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:08 compute-0 nova_compute[192698]: 2025-10-01 14:04:08.926 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:04:08 compute-0 nova_compute[192698]: 2025-10-01 14:04:08.927 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.194s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:04:10 compute-0 podman[216229]: 2025-10-01 14:04:10.197838792 +0000 UTC m=+0.097549837 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 01 14:04:10 compute-0 podman[216230]: 2025-10-01 14:04:10.255279555 +0000 UTC m=+0.148023622 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 01 14:04:10 compute-0 nova_compute[192698]: 2025-10-01 14:04:10.916 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:04:10 compute-0 nova_compute[192698]: 2025-10-01 14:04:10.917 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:04:10 compute-0 nova_compute[192698]: 2025-10-01 14:04:10.917 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:04:10 compute-0 nova_compute[192698]: 2025-10-01 14:04:10.917 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:04:10 compute-0 nova_compute[192698]: 2025-10-01 14:04:10.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:04:10 compute-0 nova_compute[192698]: 2025-10-01 14:04:10.924 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:04:12 compute-0 nova_compute[192698]: 2025-10-01 14:04:12.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:13 compute-0 nova_compute[192698]: 2025-10-01 14:04:13.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:14.231 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:04:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:14.232 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:04:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:14.232 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:04:17 compute-0 nova_compute[192698]: 2025-10-01 14:04:17.024 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "28407011-1056-4714-96fc-1e8904bbcf1f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:04:17 compute-0 nova_compute[192698]: 2025-10-01 14:04:17.025 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:04:17 compute-0 nova_compute[192698]: 2025-10-01 14:04:17.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:17 compute-0 nova_compute[192698]: 2025-10-01 14:04:17.532 2 DEBUG nova.compute.manager [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 01 14:04:18 compute-0 nova_compute[192698]: 2025-10-01 14:04:18.082 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:04:18 compute-0 nova_compute[192698]: 2025-10-01 14:04:18.083 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:04:18 compute-0 nova_compute[192698]: 2025-10-01 14:04:18.091 2 DEBUG nova.virt.hardware [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:04:18 compute-0 nova_compute[192698]: 2025-10-01 14:04:18.091 2 INFO nova.compute.claims [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:04:18 compute-0 podman[216275]: 2025-10-01 14:04:18.212138053 +0000 UTC m=+0.121059853 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 01 14:04:18 compute-0 nova_compute[192698]: 2025-10-01 14:04:18.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:19 compute-0 nova_compute[192698]: 2025-10-01 14:04:19.243 2 DEBUG nova.compute.provider_tree [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:04:19 compute-0 nova_compute[192698]: 2025-10-01 14:04:19.752 2 DEBUG nova.scheduler.client.report [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:04:20 compute-0 nova_compute[192698]: 2025-10-01 14:04:20.266 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.183s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:04:20 compute-0 nova_compute[192698]: 2025-10-01 14:04:20.268 2 DEBUG nova.compute.manager [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 01 14:04:20 compute-0 nova_compute[192698]: 2025-10-01 14:04:20.781 2 DEBUG nova.compute.manager [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 01 14:04:20 compute-0 nova_compute[192698]: 2025-10-01 14:04:20.782 2 DEBUG nova.network.neutron [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 01 14:04:20 compute-0 nova_compute[192698]: 2025-10-01 14:04:20.783 2 WARNING neutronclient.v2_0.client [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:04:20 compute-0 nova_compute[192698]: 2025-10-01 14:04:20.783 2 WARNING neutronclient.v2_0.client [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:04:21 compute-0 nova_compute[192698]: 2025-10-01 14:04:21.294 2 INFO nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 01 14:04:21 compute-0 nova_compute[192698]: 2025-10-01 14:04:21.396 2 DEBUG nova.network.neutron [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Successfully created port: 1a9d8f85-cd26-4e65-b316-4dbc35e89aca _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 01 14:04:21 compute-0 nova_compute[192698]: 2025-10-01 14:04:21.804 2 DEBUG nova.compute.manager [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.004 2 DEBUG nova.network.neutron [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Successfully updated port: 1a9d8f85-cd26-4e65-b316-4dbc35e89aca _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.071 2 DEBUG nova.compute.manager [req-f466666f-7405-4ed7-86fc-89ad365d959a req-ff762702-212a-44eb-bf0c-0527953be5d3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Received event network-changed-1a9d8f85-cd26-4e65-b316-4dbc35e89aca external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.072 2 DEBUG nova.compute.manager [req-f466666f-7405-4ed7-86fc-89ad365d959a req-ff762702-212a-44eb-bf0c-0527953be5d3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Refreshing instance network info cache due to event network-changed-1a9d8f85-cd26-4e65-b316-4dbc35e89aca. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.072 2 DEBUG oslo_concurrency.lockutils [req-f466666f-7405-4ed7-86fc-89ad365d959a req-ff762702-212a-44eb-bf0c-0527953be5d3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-28407011-1056-4714-96fc-1e8904bbcf1f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.073 2 DEBUG oslo_concurrency.lockutils [req-f466666f-7405-4ed7-86fc-89ad365d959a req-ff762702-212a-44eb-bf0c-0527953be5d3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-28407011-1056-4714-96fc-1e8904bbcf1f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.073 2 DEBUG nova.network.neutron [req-f466666f-7405-4ed7-86fc-89ad365d959a req-ff762702-212a-44eb-bf0c-0527953be5d3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Refreshing network info cache for port 1a9d8f85-cd26-4e65-b316-4dbc35e89aca _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.511 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "refresh_cache-28407011-1056-4714-96fc-1e8904bbcf1f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.579 2 WARNING neutronclient.v2_0.client [req-f466666f-7405-4ed7-86fc-89ad365d959a req-ff762702-212a-44eb-bf0c-0527953be5d3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.721 2 DEBUG nova.network.neutron [req-f466666f-7405-4ed7-86fc-89ad365d959a req-ff762702-212a-44eb-bf0c-0527953be5d3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.828 2 DEBUG nova.compute.manager [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.830 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.831 2 INFO nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Creating image(s)
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.832 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "/var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.832 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "/var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.833 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "/var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.834 2 DEBUG oslo_utils.imageutils.format_inspector [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.840 2 DEBUG oslo_utils.imageutils.format_inspector [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.844 2 DEBUG oslo_concurrency.processutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.878 2 DEBUG nova.network.neutron [req-f466666f-7405-4ed7-86fc-89ad365d959a req-ff762702-212a-44eb-bf0c-0527953be5d3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.928 2 DEBUG oslo_concurrency.processutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.929 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.929 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.930 2 DEBUG oslo_utils.imageutils.format_inspector [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.934 2 DEBUG oslo_utils.imageutils.format_inspector [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:04:22 compute-0 nova_compute[192698]: 2025-10-01 14:04:22.935 2 DEBUG oslo_concurrency.processutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.019 2 DEBUG oslo_concurrency.processutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.021 2 DEBUG oslo_concurrency.processutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.074 2 DEBUG oslo_concurrency.processutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.075 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.075 2 DEBUG oslo_concurrency.processutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.151 2 DEBUG oslo_concurrency.processutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.152 2 DEBUG nova.virt.disk.api [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Checking if we can resize image /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.153 2 DEBUG oslo_concurrency.processutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.205 2 DEBUG oslo_concurrency.processutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.207 2 DEBUG nova.virt.disk.api [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Cannot resize image /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.208 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.208 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Ensure instance console log exists: /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.209 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.209 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.210 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.387 2 DEBUG oslo_concurrency.lockutils [req-f466666f-7405-4ed7-86fc-89ad365d959a req-ff762702-212a-44eb-bf0c-0527953be5d3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-28407011-1056-4714-96fc-1e8904bbcf1f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.388 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquired lock "refresh_cache-28407011-1056-4714-96fc-1e8904bbcf1f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.389 2 DEBUG nova.network.neutron [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:23 compute-0 nova_compute[192698]: 2025-10-01 14:04:23.992 2 DEBUG nova.network.neutron [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:04:24 compute-0 podman[216312]: 2025-10-01 14:04:24.188139969 +0000 UTC m=+0.092113081 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct 01 14:04:24 compute-0 podman[216311]: 2025-10-01 14:04:24.188353335 +0000 UTC m=+0.096772927 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible)
Oct 01 14:04:24 compute-0 nova_compute[192698]: 2025-10-01 14:04:24.298 2 WARNING neutronclient.v2_0.client [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:04:24 compute-0 nova_compute[192698]: 2025-10-01 14:04:24.471 2 DEBUG nova.network.neutron [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Updating instance_info_cache with network_info: [{"id": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "address": "fa:16:3e:17:68:4f", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a9d8f85-cd", "ovs_interfaceid": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:04:24 compute-0 nova_compute[192698]: 2025-10-01 14:04:24.977 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Releasing lock "refresh_cache-28407011-1056-4714-96fc-1e8904bbcf1f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:04:24 compute-0 nova_compute[192698]: 2025-10-01 14:04:24.978 2 DEBUG nova.compute.manager [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Instance network_info: |[{"id": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "address": "fa:16:3e:17:68:4f", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a9d8f85-cd", "ovs_interfaceid": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 01 14:04:24 compute-0 nova_compute[192698]: 2025-10-01 14:04:24.982 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Start _get_guest_xml network_info=[{"id": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "address": "fa:16:3e:17:68:4f", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a9d8f85-cd", "ovs_interfaceid": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:04:24 compute-0 nova_compute[192698]: 2025-10-01 14:04:24.990 2 WARNING nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:04:24 compute-0 nova_compute[192698]: 2025-10-01 14:04:24.992 2 DEBUG nova.virt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-757789601', uuid='28407011-1056-4714-96fc-1e8904bbcf1f'), owner=OwnerMeta(userid='82619989ef1f48a39f1c1e7d64e4cb38', username='tempest-TestExecuteActionsViaActuator-2075848047-project-admin', projectid='67079b4774294271895bbf7b04f602e7', projectname='tempest-TestExecuteActionsViaActuator-2075848047'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "address": "fa:16:3e:17:68:4f", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a9d8f85-cd", "ovs_interfaceid": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759327464.992049) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.024 2 DEBUG nova.virt.libvirt.host [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.024 2 DEBUG nova.virt.libvirt.host [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.028 2 DEBUG nova.virt.libvirt.host [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.029 2 DEBUG nova.virt.libvirt.host [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.030 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.030 2 DEBUG nova.virt.hardware [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.031 2 DEBUG nova.virt.hardware [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.031 2 DEBUG nova.virt.hardware [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.032 2 DEBUG nova.virt.hardware [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.032 2 DEBUG nova.virt.hardware [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.032 2 DEBUG nova.virt.hardware [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.032 2 DEBUG nova.virt.hardware [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.033 2 DEBUG nova.virt.hardware [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.033 2 DEBUG nova.virt.hardware [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.033 2 DEBUG nova.virt.hardware [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.034 2 DEBUG nova.virt.hardware [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.040 2 DEBUG nova.virt.libvirt.vif [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:04:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-757789601',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-757789601',id=5,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-f6mvgq4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteActionsViaActuator-2075848047-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:04:21Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=28407011-1056-4714-96fc-1e8904bbcf1f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "address": "fa:16:3e:17:68:4f", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a9d8f85-cd", "ovs_interfaceid": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.040 2 DEBUG nova.network.os_vif_util [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converting VIF {"id": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "address": "fa:16:3e:17:68:4f", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a9d8f85-cd", "ovs_interfaceid": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.041 2 DEBUG nova.network.os_vif_util [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=1a9d8f85-cd26-4e65-b316-4dbc35e89aca,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a9d8f85-cd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.043 2 DEBUG nova.objects.instance [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid 28407011-1056-4714-96fc-1e8904bbcf1f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.553 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:04:25 compute-0 nova_compute[192698]:   <uuid>28407011-1056-4714-96fc-1e8904bbcf1f</uuid>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   <name>instance-00000005</name>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-757789601</nova:name>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:04:24</nova:creationTime>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:04:25 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:04:25 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:user uuid="82619989ef1f48a39f1c1e7d64e4cb38">tempest-TestExecuteActionsViaActuator-2075848047-project-admin</nova:user>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:project uuid="67079b4774294271895bbf7b04f602e7">tempest-TestExecuteActionsViaActuator-2075848047</nova:project>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         <nova:port uuid="1a9d8f85-cd26-4e65-b316-4dbc35e89aca">
Oct 01 14:04:25 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <system>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <entry name="serial">28407011-1056-4714-96fc-1e8904bbcf1f</entry>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <entry name="uuid">28407011-1056-4714-96fc-1e8904bbcf1f</entry>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     </system>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   <os>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   </os>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   <features>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   </features>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk.config"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:17:68:4f"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <target dev="tap1a9d8f85-cd"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/console.log" append="off"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <video>
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     </video>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:04:25 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:04:25 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:04:25 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:04:25 compute-0 nova_compute[192698]: </domain>
Oct 01 14:04:25 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.556 2 DEBUG nova.compute.manager [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Preparing to wait for external event network-vif-plugged-1a9d8f85-cd26-4e65-b316-4dbc35e89aca prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.557 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.557 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.558 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.559 2 DEBUG nova.virt.libvirt.vif [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:04:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-757789601',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-757789601',id=5,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-f6mvgq4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteAct
ionsViaActuator-2075848047-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:04:21Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=28407011-1056-4714-96fc-1e8904bbcf1f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "address": "fa:16:3e:17:68:4f", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a9d8f85-cd", "ovs_interfaceid": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.560 2 DEBUG nova.network.os_vif_util [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converting VIF {"id": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "address": "fa:16:3e:17:68:4f", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a9d8f85-cd", "ovs_interfaceid": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.561 2 DEBUG nova.network.os_vif_util [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=1a9d8f85-cd26-4e65-b316-4dbc35e89aca,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a9d8f85-cd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.562 2 DEBUG os_vif [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=1a9d8f85-cd26-4e65-b316-4dbc35e89aca,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a9d8f85-cd') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.564 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.564 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.566 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '619de795-73fe-5300-8a71-88c9b85ea47f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.620 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a9d8f85-cd, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.620 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap1a9d8f85-cd, col_values=(('qos', UUID('b3b25e8c-1058-4fb0-9d1b-fca029736e41')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.621 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap1a9d8f85-cd, col_values=(('external_ids', {'iface-id': '1a9d8f85-cd26-4e65-b316-4dbc35e89aca', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:68:4f', 'vm-uuid': '28407011-1056-4714-96fc-1e8904bbcf1f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:04:25 compute-0 NetworkManager[51741]: <info>  [1759327465.6240] manager: (tap1a9d8f85-cd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:25 compute-0 nova_compute[192698]: 2025-10-01 14:04:25.635 2 INFO os_vif [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=1a9d8f85-cd26-4e65-b316-4dbc35e89aca,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a9d8f85-cd')
Oct 01 14:04:27 compute-0 nova_compute[192698]: 2025-10-01 14:04:27.188 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:04:27 compute-0 nova_compute[192698]: 2025-10-01 14:04:27.188 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:04:27 compute-0 nova_compute[192698]: 2025-10-01 14:04:27.189 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] No VIF found with MAC fa:16:3e:17:68:4f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:04:27 compute-0 nova_compute[192698]: 2025-10-01 14:04:27.190 2 INFO nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Using config drive
Oct 01 14:04:27 compute-0 nova_compute[192698]: 2025-10-01 14:04:27.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:27 compute-0 nova_compute[192698]: 2025-10-01 14:04:27.703 2 WARNING neutronclient.v2_0.client [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:04:28 compute-0 nova_compute[192698]: 2025-10-01 14:04:28.044 2 INFO nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Creating config drive at /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk.config
Oct 01 14:04:28 compute-0 nova_compute[192698]: 2025-10-01 14:04:28.054 2 DEBUG oslo_concurrency.processutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpm2gm5ohb execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:04:28 compute-0 nova_compute[192698]: 2025-10-01 14:04:28.197 2 DEBUG oslo_concurrency.processutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpm2gm5ohb" returned: 0 in 0.143s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:04:28 compute-0 kernel: tap1a9d8f85-cd: entered promiscuous mode
Oct 01 14:04:28 compute-0 NetworkManager[51741]: <info>  [1759327468.2934] manager: (tap1a9d8f85-cd): new Tun device (/org/freedesktop/NetworkManager/Devices/27)
Oct 01 14:04:28 compute-0 ovn_controller[94909]: 2025-10-01T14:04:28Z|00049|binding|INFO|Claiming lport 1a9d8f85-cd26-4e65-b316-4dbc35e89aca for this chassis.
Oct 01 14:04:28 compute-0 ovn_controller[94909]: 2025-10-01T14:04:28Z|00050|binding|INFO|1a9d8f85-cd26-4e65-b316-4dbc35e89aca: Claiming fa:16:3e:17:68:4f 10.100.0.3
Oct 01 14:04:28 compute-0 nova_compute[192698]: 2025-10-01 14:04:28.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:28 compute-0 nova_compute[192698]: 2025-10-01 14:04:28.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.346 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:68:4f 10.100.0.3'], port_security=['fa:16:3e:17:68:4f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '28407011-1056-4714-96fc-1e8904bbcf1f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67079b4774294271895bbf7b04f602e7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'de7872a1-1f76-4b0f-8bd9-119520ff7a88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e3a455d-1f77-441e-b08a-0ec8231910e5, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=1a9d8f85-cd26-4e65-b316-4dbc35e89aca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.348 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 1a9d8f85-cd26-4e65-b316-4dbc35e89aca in datapath e35f096a-fd75-4d70-ae58-8a76ae666b9d bound to our chassis
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.350 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e35f096a-fd75-4d70-ae58-8a76ae666b9d
Oct 01 14:04:28 compute-0 systemd-udevd[216371]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:04:28 compute-0 NetworkManager[51741]: <info>  [1759327468.3667] device (tap1a9d8f85-cd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:04:28 compute-0 NetworkManager[51741]: <info>  [1759327468.3676] device (tap1a9d8f85-cd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.373 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d9abaddd-dced-4959-8111-2494b3a8c3a5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.374 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape35f096a-f1 in ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:04:28 compute-0 systemd-machined[152704]: New machine qemu-2-instance-00000005.
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.383 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape35f096a-f0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.384 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[5236c892-8355-403c-a9a2-9486c3514d8b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.385 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a576733f-505c-468e-a49d-b9568ffaff88]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.407 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[928afe11-d08b-49ad-a977-5f53aacbc99d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_controller[94909]: 2025-10-01T14:04:28Z|00051|binding|INFO|Setting lport 1a9d8f85-cd26-4e65-b316-4dbc35e89aca ovn-installed in OVS
Oct 01 14:04:28 compute-0 ovn_controller[94909]: 2025-10-01T14:04:28Z|00052|binding|INFO|Setting lport 1a9d8f85-cd26-4e65-b316-4dbc35e89aca up in Southbound
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.420 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[83bf4aa7-29fe-4fb1-bafb-31267db67e8e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 nova_compute[192698]: 2025-10-01 14:04:28.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:28 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.470 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[afb6c32c-f91f-49eb-8cca-c9956f17f62c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.478 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d616c2ae-cbcd-4be0-bb18-b40e2c3c8077]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 NetworkManager[51741]: <info>  [1759327468.4801] manager: (tape35f096a-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/28)
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.539 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[df07d191-0df8-4c4f-a322-b609b4ce7aa6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.544 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[39402a20-829c-4cdf-9eee-5016cfd5b533]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 NetworkManager[51741]: <info>  [1759327468.5889] device (tape35f096a-f0): carrier: link connected
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.597 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[acb80412-b8b1-4942-a7b2-2853a25328e9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.624 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[517d1f3d-e314-4928-8bea-1a0a639bb0e3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape35f096a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:1b:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382912, 'reachable_time': 27351, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216405, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.651 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e5b396-fdd5-43a1-9136-aaae4baace0c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe47:1b50'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382912, 'tstamp': 382912}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216406, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.683 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[155aee74-7483-4e89-b6ab-27a6164fe941]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape35f096a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:1b:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382912, 'reachable_time': 27351, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216407, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.737 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a026deaf-5f74-4e10-a6df-bc0ad826f8db]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.845 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[bf03308f-4ec0-463e-8d93-9f4f277e17f1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.847 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape35f096a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.847 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.848 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape35f096a-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:04:28 compute-0 nova_compute[192698]: 2025-10-01 14:04:28.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:28 compute-0 kernel: tape35f096a-f0: entered promiscuous mode
Oct 01 14:04:28 compute-0 NetworkManager[51741]: <info>  [1759327468.8528] manager: (tape35f096a-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Oct 01 14:04:28 compute-0 nova_compute[192698]: 2025-10-01 14:04:28.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.855 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape35f096a-f0, col_values=(('external_ids', {'iface-id': '3f9111f1-79b1-4bf1-bb95-d924c71fb42c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:04:28 compute-0 ovn_controller[94909]: 2025-10-01T14:04:28Z|00053|binding|INFO|Releasing lport 3f9111f1-79b1-4bf1-bb95-d924c71fb42c from this chassis (sb_readonly=0)
Oct 01 14:04:28 compute-0 nova_compute[192698]: 2025-10-01 14:04:28.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:28 compute-0 nova_compute[192698]: 2025-10-01 14:04:28.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.885 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f950a53d-8821-4def-b5a2-da151d62da66]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.886 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.886 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.887 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for e35f096a-fd75-4d70-ae58-8a76ae666b9d disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.887 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.887 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[770773ab-3537-4269-9464-7a83564b2046]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.888 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.888 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ef43890d-25f3-42e3-aac9-d36a88e7b1ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.889 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-e35f096a-fd75-4d70-ae58-8a76ae666b9d
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID e35f096a-fd75-4d70-ae58-8a76ae666b9d
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:04:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:04:28.889 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'env', 'PROCESS_TAG=haproxy-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e35f096a-fd75-4d70-ae58-8a76ae666b9d.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.094 2 DEBUG nova.compute.manager [req-f7ba2958-35a4-4b1d-ad69-00b79e803647 req-f06a7dd9-5c59-4edc-83e1-e9172161f514 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Received event network-vif-plugged-1a9d8f85-cd26-4e65-b316-4dbc35e89aca external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.095 2 DEBUG oslo_concurrency.lockutils [req-f7ba2958-35a4-4b1d-ad69-00b79e803647 req-f06a7dd9-5c59-4edc-83e1-e9172161f514 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.095 2 DEBUG oslo_concurrency.lockutils [req-f7ba2958-35a4-4b1d-ad69-00b79e803647 req-f06a7dd9-5c59-4edc-83e1-e9172161f514 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.096 2 DEBUG oslo_concurrency.lockutils [req-f7ba2958-35a4-4b1d-ad69-00b79e803647 req-f06a7dd9-5c59-4edc-83e1-e9172161f514 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.104 2 DEBUG nova.compute.manager [req-f7ba2958-35a4-4b1d-ad69-00b79e803647 req-f06a7dd9-5c59-4edc-83e1-e9172161f514 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Processing event network-vif-plugged-1a9d8f85-cd26-4e65-b316-4dbc35e89aca _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:04:29 compute-0 podman[216446]: 2025-10-01 14:04:29.38666758 +0000 UTC m=+0.087869656 container create 0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930)
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.428 2 DEBUG nova.compute.manager [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.435 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 01 14:04:29 compute-0 podman[216446]: 2025-10-01 14:04:29.345337013 +0000 UTC m=+0.046539139 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.438 2 INFO nova.virt.libvirt.driver [-] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Instance spawned successfully.
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.439 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 01 14:04:29 compute-0 systemd[1]: Started libpod-conmon-0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360.scope.
Oct 01 14:04:29 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:04:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0174d47f382f2361024b4c63aef3c38862dee1d030ae143933ff05834db39753/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:04:29 compute-0 podman[216446]: 2025-10-01 14:04:29.511015211 +0000 UTC m=+0.212217347 container init 0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:04:29 compute-0 podman[216446]: 2025-10-01 14:04:29.522243854 +0000 UTC m=+0.223445930 container start 0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Oct 01 14:04:29 compute-0 neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d[216462]: [NOTICE]   (216466) : New worker (216468) forked
Oct 01 14:04:29 compute-0 neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d[216462]: [NOTICE]   (216466) : Loading success.
Oct 01 14:04:29 compute-0 podman[203144]: time="2025-10-01T14:04:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:04:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:04:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:04:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:04:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3455 "" "Go-http-client/1.1"
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.957 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.958 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.958 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.958 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.959 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:04:29 compute-0 nova_compute[192698]: 2025-10-01 14:04:29.959 2 DEBUG nova.virt.libvirt.driver [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:04:30 compute-0 nova_compute[192698]: 2025-10-01 14:04:30.470 2 INFO nova.compute.manager [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Took 7.64 seconds to spawn the instance on the hypervisor.
Oct 01 14:04:30 compute-0 nova_compute[192698]: 2025-10-01 14:04:30.472 2 DEBUG nova.compute.manager [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:04:30 compute-0 nova_compute[192698]: 2025-10-01 14:04:30.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:31 compute-0 nova_compute[192698]: 2025-10-01 14:04:31.017 2 INFO nova.compute.manager [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Took 12.97 seconds to build instance.
Oct 01 14:04:31 compute-0 nova_compute[192698]: 2025-10-01 14:04:31.162 2 DEBUG nova.compute.manager [req-4afdd2e8-ed42-400e-8e42-52893e04bc3b req-13b89daa-a2d8-4cf3-9c95-43084eb294cf 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Received event network-vif-plugged-1a9d8f85-cd26-4e65-b316-4dbc35e89aca external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:04:31 compute-0 nova_compute[192698]: 2025-10-01 14:04:31.163 2 DEBUG oslo_concurrency.lockutils [req-4afdd2e8-ed42-400e-8e42-52893e04bc3b req-13b89daa-a2d8-4cf3-9c95-43084eb294cf 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:04:31 compute-0 nova_compute[192698]: 2025-10-01 14:04:31.166 2 DEBUG oslo_concurrency.lockutils [req-4afdd2e8-ed42-400e-8e42-52893e04bc3b req-13b89daa-a2d8-4cf3-9c95-43084eb294cf 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:04:31 compute-0 nova_compute[192698]: 2025-10-01 14:04:31.167 2 DEBUG oslo_concurrency.lockutils [req-4afdd2e8-ed42-400e-8e42-52893e04bc3b req-13b89daa-a2d8-4cf3-9c95-43084eb294cf 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:04:31 compute-0 nova_compute[192698]: 2025-10-01 14:04:31.168 2 DEBUG nova.compute.manager [req-4afdd2e8-ed42-400e-8e42-52893e04bc3b req-13b89daa-a2d8-4cf3-9c95-43084eb294cf 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] No waiting events found dispatching network-vif-plugged-1a9d8f85-cd26-4e65-b316-4dbc35e89aca pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:04:31 compute-0 nova_compute[192698]: 2025-10-01 14:04:31.169 2 WARNING nova.compute.manager [req-4afdd2e8-ed42-400e-8e42-52893e04bc3b req-13b89daa-a2d8-4cf3-9c95-43084eb294cf 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Received unexpected event network-vif-plugged-1a9d8f85-cd26-4e65-b316-4dbc35e89aca for instance with vm_state active and task_state None.
Oct 01 14:04:31 compute-0 openstack_network_exporter[205307]: ERROR   14:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:04:31 compute-0 openstack_network_exporter[205307]: ERROR   14:04:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:04:31 compute-0 openstack_network_exporter[205307]: ERROR   14:04:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:04:31 compute-0 openstack_network_exporter[205307]: ERROR   14:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:04:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:04:31 compute-0 openstack_network_exporter[205307]: ERROR   14:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:04:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:04:31 compute-0 nova_compute[192698]: 2025-10-01 14:04:31.524 2 DEBUG oslo_concurrency.lockutils [None req-95424ee5-fe48-4f40-87d2-ca7019391580 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.499s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:04:32 compute-0 podman[216477]: 2025-10-01 14:04:32.17845712 +0000 UTC m=+0.075676506 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:04:32 compute-0 nova_compute[192698]: 2025-10-01 14:04:32.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:35 compute-0 nova_compute[192698]: 2025-10-01 14:04:35.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:37 compute-0 nova_compute[192698]: 2025-10-01 14:04:37.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:40 compute-0 nova_compute[192698]: 2025-10-01 14:04:40.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:41 compute-0 podman[216522]: 2025-10-01 14:04:41.149795717 +0000 UTC m=+0.060284011 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:04:41 compute-0 podman[216523]: 2025-10-01 14:04:41.24610881 +0000 UTC m=+0.140507629 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 14:04:42 compute-0 ovn_controller[94909]: 2025-10-01T14:04:42Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:17:68:4f 10.100.0.3
Oct 01 14:04:42 compute-0 ovn_controller[94909]: 2025-10-01T14:04:42Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:17:68:4f 10.100.0.3
Oct 01 14:04:42 compute-0 nova_compute[192698]: 2025-10-01 14:04:42.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:45 compute-0 nova_compute[192698]: 2025-10-01 14:04:45.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:47 compute-0 nova_compute[192698]: 2025-10-01 14:04:47.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:49 compute-0 podman[216567]: 2025-10-01 14:04:49.168761023 +0000 UTC m=+0.079879050 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 01 14:04:50 compute-0 nova_compute[192698]: 2025-10-01 14:04:50.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:52 compute-0 nova_compute[192698]: 2025-10-01 14:04:52.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:55 compute-0 podman[216588]: 2025-10-01 14:04:55.177206375 +0000 UTC m=+0.097563968 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 01 14:04:55 compute-0 podman[216589]: 2025-10-01 14:04:55.199982091 +0000 UTC m=+0.108725330 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Oct 01 14:04:55 compute-0 nova_compute[192698]: 2025-10-01 14:04:55.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:57 compute-0 nova_compute[192698]: 2025-10-01 14:04:57.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:04:59 compute-0 podman[203144]: time="2025-10-01T14:04:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:04:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:04:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:04:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:04:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3462 "" "Go-http-client/1.1"
Oct 01 14:05:00 compute-0 nova_compute[192698]: 2025-10-01 14:05:00.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:01 compute-0 openstack_network_exporter[205307]: ERROR   14:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:05:01 compute-0 openstack_network_exporter[205307]: ERROR   14:05:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:05:01 compute-0 openstack_network_exporter[205307]: ERROR   14:05:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:05:01 compute-0 openstack_network_exporter[205307]: ERROR   14:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:05:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:05:01 compute-0 openstack_network_exporter[205307]: ERROR   14:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:05:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:05:02 compute-0 nova_compute[192698]: 2025-10-01 14:05:02.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:03 compute-0 podman[216628]: 2025-10-01 14:05:03.158882583 +0000 UTC m=+0.075120491 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:05:05 compute-0 nova_compute[192698]: 2025-10-01 14:05:04.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:05:05 compute-0 nova_compute[192698]: 2025-10-01 14:05:05.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:06 compute-0 nova_compute[192698]: 2025-10-01 14:05:06.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:05:07 compute-0 nova_compute[192698]: 2025-10-01 14:05:07.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:07 compute-0 nova_compute[192698]: 2025-10-01 14:05:07.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:05:07 compute-0 nova_compute[192698]: 2025-10-01 14:05:07.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:05:08 compute-0 nova_compute[192698]: 2025-10-01 14:05:08.440 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:05:08 compute-0 nova_compute[192698]: 2025-10-01 14:05:08.442 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:05:08 compute-0 nova_compute[192698]: 2025-10-01 14:05:08.442 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:05:08 compute-0 nova_compute[192698]: 2025-10-01 14:05:08.443 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:05:09 compute-0 nova_compute[192698]: 2025-10-01 14:05:09.514 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:05:09 compute-0 nova_compute[192698]: 2025-10-01 14:05:09.592 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:05:09 compute-0 nova_compute[192698]: 2025-10-01 14:05:09.594 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:05:09 compute-0 nova_compute[192698]: 2025-10-01 14:05:09.684 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:05:09 compute-0 nova_compute[192698]: 2025-10-01 14:05:09.931 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:05:09 compute-0 nova_compute[192698]: 2025-10-01 14:05:09.933 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:05:09 compute-0 nova_compute[192698]: 2025-10-01 14:05:09.966 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:05:09 compute-0 nova_compute[192698]: 2025-10-01 14:05:09.967 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5700MB free_disk=73.27809524536133GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:05:09 compute-0 nova_compute[192698]: 2025-10-01 14:05:09.968 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:05:09 compute-0 nova_compute[192698]: 2025-10-01 14:05:09.968 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:05:10 compute-0 nova_compute[192698]: 2025-10-01 14:05:10.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:11 compute-0 nova_compute[192698]: 2025-10-01 14:05:11.057 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 28407011-1056-4714-96fc-1e8904bbcf1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:05:11 compute-0 nova_compute[192698]: 2025-10-01 14:05:11.057 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:05:11 compute-0 nova_compute[192698]: 2025-10-01 14:05:11.058 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:05:09 up  1:04,  0 user,  load average: 0.42, 0.38, 0.51\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_67079b4774294271895bbf7b04f602e7': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:05:11 compute-0 nova_compute[192698]: 2025-10-01 14:05:11.106 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:05:11 compute-0 nova_compute[192698]: 2025-10-01 14:05:11.616 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:05:12 compute-0 nova_compute[192698]: 2025-10-01 14:05:12.125 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:05:12 compute-0 nova_compute[192698]: 2025-10-01 14:05:12.125 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.157s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:05:12 compute-0 podman[216663]: 2025-10-01 14:05:12.180120539 +0000 UTC m=+0.089013047 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 01 14:05:12 compute-0 podman[216664]: 2025-10-01 14:05:12.238412925 +0000 UTC m=+0.136248254 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:05:12 compute-0 nova_compute[192698]: 2025-10-01 14:05:12.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:13 compute-0 nova_compute[192698]: 2025-10-01 14:05:13.114 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:05:13 compute-0 nova_compute[192698]: 2025-10-01 14:05:13.114 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:05:13 compute-0 nova_compute[192698]: 2025-10-01 14:05:13.115 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:05:13 compute-0 nova_compute[192698]: 2025-10-01 14:05:13.115 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:05:13 compute-0 nova_compute[192698]: 2025-10-01 14:05:13.115 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:05:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:14.233 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:05:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:14.234 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:05:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:14.236 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:05:14 compute-0 nova_compute[192698]: 2025-10-01 14:05:14.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:05:15 compute-0 nova_compute[192698]: 2025-10-01 14:05:15.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:17 compute-0 nova_compute[192698]: 2025-10-01 14:05:17.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:18.002 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:05:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:18.003 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:05:18 compute-0 nova_compute[192698]: 2025-10-01 14:05:18.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:20 compute-0 podman[216711]: 2025-10-01 14:05:20.173596331 +0000 UTC m=+0.081050391 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 01 14:05:20 compute-0 nova_compute[192698]: 2025-10-01 14:05:20.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:22 compute-0 nova_compute[192698]: 2025-10-01 14:05:22.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:25 compute-0 nova_compute[192698]: 2025-10-01 14:05:25.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:26.005 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:05:26 compute-0 podman[216732]: 2025-10-01 14:05:26.157492358 +0000 UTC m=+0.067780593 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4)
Oct 01 14:05:26 compute-0 podman[216733]: 2025-10-01 14:05:26.203443048 +0000 UTC m=+0.105762952 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true)
Oct 01 14:05:27 compute-0 nova_compute[192698]: 2025-10-01 14:05:27.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:29 compute-0 podman[203144]: time="2025-10-01T14:05:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:05:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:05:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:05:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:05:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3456 "" "Go-http-client/1.1"
Oct 01 14:05:30 compute-0 nova_compute[192698]: 2025-10-01 14:05:30.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:31 compute-0 openstack_network_exporter[205307]: ERROR   14:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:05:31 compute-0 openstack_network_exporter[205307]: ERROR   14:05:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:05:31 compute-0 openstack_network_exporter[205307]: ERROR   14:05:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:05:31 compute-0 openstack_network_exporter[205307]: ERROR   14:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:05:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:05:31 compute-0 openstack_network_exporter[205307]: ERROR   14:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:05:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:05:32 compute-0 nova_compute[192698]: 2025-10-01 14:05:32.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:34 compute-0 podman[216770]: 2025-10-01 14:05:34.211063055 +0000 UTC m=+0.107944675 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:05:35 compute-0 nova_compute[192698]: 2025-10-01 14:05:35.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:37 compute-0 nova_compute[192698]: 2025-10-01 14:05:37.090 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:05:37 compute-0 nova_compute[192698]: 2025-10-01 14:05:37.090 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:05:37 compute-0 nova_compute[192698]: 2025-10-01 14:05:37.598 2 DEBUG nova.compute.manager [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 01 14:05:37 compute-0 nova_compute[192698]: 2025-10-01 14:05:37.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:38 compute-0 nova_compute[192698]: 2025-10-01 14:05:38.171 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:05:38 compute-0 nova_compute[192698]: 2025-10-01 14:05:38.172 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:05:38 compute-0 nova_compute[192698]: 2025-10-01 14:05:38.181 2 DEBUG nova.virt.hardware [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:05:38 compute-0 nova_compute[192698]: 2025-10-01 14:05:38.181 2 INFO nova.compute.claims [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:05:39 compute-0 nova_compute[192698]: 2025-10-01 14:05:39.266 2 DEBUG nova.compute.provider_tree [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:05:39 compute-0 sshd-session[216794]: banner exchange: Connection from 93.123.109.214 port 47776: invalid format
Oct 01 14:05:39 compute-0 sshd-session[216795]: banner exchange: Connection from 93.123.109.214 port 47792: invalid format
Oct 01 14:05:39 compute-0 nova_compute[192698]: 2025-10-01 14:05:39.778 2 DEBUG nova.scheduler.client.report [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:05:40 compute-0 nova_compute[192698]: 2025-10-01 14:05:40.293 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.121s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:05:40 compute-0 nova_compute[192698]: 2025-10-01 14:05:40.294 2 DEBUG nova.compute.manager [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 01 14:05:40 compute-0 nova_compute[192698]: 2025-10-01 14:05:40.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:40 compute-0 nova_compute[192698]: 2025-10-01 14:05:40.808 2 DEBUG nova.compute.manager [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 01 14:05:40 compute-0 nova_compute[192698]: 2025-10-01 14:05:40.808 2 DEBUG nova.network.neutron [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 01 14:05:40 compute-0 nova_compute[192698]: 2025-10-01 14:05:40.809 2 WARNING neutronclient.v2_0.client [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:05:40 compute-0 nova_compute[192698]: 2025-10-01 14:05:40.810 2 WARNING neutronclient.v2_0.client [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:05:41 compute-0 nova_compute[192698]: 2025-10-01 14:05:41.319 2 INFO nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 01 14:05:41 compute-0 nova_compute[192698]: 2025-10-01 14:05:41.831 2 DEBUG nova.compute.manager [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.855 2 DEBUG nova.compute.manager [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.858 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.859 2 INFO nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Creating image(s)
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.859 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "/var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.860 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "/var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.861 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "/var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.863 2 DEBUG oslo_utils.imageutils.format_inspector [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.869 2 DEBUG oslo_utils.imageutils.format_inspector [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.874 2 DEBUG oslo_concurrency.processutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.929 2 DEBUG oslo_concurrency.processutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.930 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.931 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.932 2 DEBUG oslo_utils.imageutils.format_inspector [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.939 2 DEBUG oslo_utils.imageutils.format_inspector [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.940 2 DEBUG oslo_concurrency.processutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.984 2 DEBUG nova.network.neutron [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Successfully created port: 9e1db054-d550-4384-9fd6-118c2eea0c89 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.993 2 DEBUG oslo_concurrency.processutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:05:42 compute-0 nova_compute[192698]: 2025-10-01 14:05:42.994 2 DEBUG oslo_concurrency.processutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.100 2 DEBUG oslo_concurrency.processutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk 1073741824" returned: 0 in 0.106s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.101 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.170s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.101 2 DEBUG oslo_concurrency.processutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.170 2 DEBUG oslo_concurrency.processutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.171 2 DEBUG nova.virt.disk.api [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Checking if we can resize image /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.171 2 DEBUG oslo_concurrency.processutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:05:43 compute-0 podman[216819]: 2025-10-01 14:05:43.191815477 +0000 UTC m=+0.089204487 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.234 2 DEBUG oslo_concurrency.processutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.235 2 DEBUG nova.virt.disk.api [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Cannot resize image /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.236 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.238 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Ensure instance console log exists: /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:05:43 compute-0 podman[216820]: 2025-10-01 14:05:43.238712189 +0000 UTC m=+0.129771527 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.239 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.239 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.240 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.584 2 DEBUG nova.network.neutron [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Successfully updated port: 9e1db054-d550-4384-9fd6-118c2eea0c89 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.649 2 DEBUG nova.compute.manager [req-720aa061-e7c5-427d-b861-aa6e2dbfd6ec req-d399f5a7-6e4f-45e9-901b-730049a8e2d6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Received event network-changed-9e1db054-d550-4384-9fd6-118c2eea0c89 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.649 2 DEBUG nova.compute.manager [req-720aa061-e7c5-427d-b861-aa6e2dbfd6ec req-d399f5a7-6e4f-45e9-901b-730049a8e2d6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Refreshing instance network info cache due to event network-changed-9e1db054-d550-4384-9fd6-118c2eea0c89. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.650 2 DEBUG oslo_concurrency.lockutils [req-720aa061-e7c5-427d-b861-aa6e2dbfd6ec req-d399f5a7-6e4f-45e9-901b-730049a8e2d6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-ff5702e3-c6c5-4b82-a9c4-6a06747a4cae" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.650 2 DEBUG oslo_concurrency.lockutils [req-720aa061-e7c5-427d-b861-aa6e2dbfd6ec req-d399f5a7-6e4f-45e9-901b-730049a8e2d6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-ff5702e3-c6c5-4b82-a9c4-6a06747a4cae" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:05:43 compute-0 nova_compute[192698]: 2025-10-01 14:05:43.651 2 DEBUG nova.network.neutron [req-720aa061-e7c5-427d-b861-aa6e2dbfd6ec req-d399f5a7-6e4f-45e9-901b-730049a8e2d6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Refreshing network info cache for port 9e1db054-d550-4384-9fd6-118c2eea0c89 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:05:44 compute-0 nova_compute[192698]: 2025-10-01 14:05:44.093 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "refresh_cache-ff5702e3-c6c5-4b82-a9c4-6a06747a4cae" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:05:44 compute-0 nova_compute[192698]: 2025-10-01 14:05:44.162 2 WARNING neutronclient.v2_0.client [req-720aa061-e7c5-427d-b861-aa6e2dbfd6ec req-d399f5a7-6e4f-45e9-901b-730049a8e2d6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:05:44 compute-0 nova_compute[192698]: 2025-10-01 14:05:44.299 2 DEBUG nova.network.neutron [req-720aa061-e7c5-427d-b861-aa6e2dbfd6ec req-d399f5a7-6e4f-45e9-901b-730049a8e2d6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:05:45 compute-0 nova_compute[192698]: 2025-10-01 14:05:45.082 2 DEBUG nova.network.neutron [req-720aa061-e7c5-427d-b861-aa6e2dbfd6ec req-d399f5a7-6e4f-45e9-901b-730049a8e2d6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:05:45 compute-0 nova_compute[192698]: 2025-10-01 14:05:45.589 2 DEBUG oslo_concurrency.lockutils [req-720aa061-e7c5-427d-b861-aa6e2dbfd6ec req-d399f5a7-6e4f-45e9-901b-730049a8e2d6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-ff5702e3-c6c5-4b82-a9c4-6a06747a4cae" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:05:45 compute-0 nova_compute[192698]: 2025-10-01 14:05:45.590 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquired lock "refresh_cache-ff5702e3-c6c5-4b82-a9c4-6a06747a4cae" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:05:45 compute-0 nova_compute[192698]: 2025-10-01 14:05:45.590 2 DEBUG nova.network.neutron [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:05:45 compute-0 nova_compute[192698]: 2025-10-01 14:05:45.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:46 compute-0 nova_compute[192698]: 2025-10-01 14:05:46.990 2 DEBUG nova.network.neutron [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:05:47 compute-0 nova_compute[192698]: 2025-10-01 14:05:47.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:47 compute-0 nova_compute[192698]: 2025-10-01 14:05:47.984 2 WARNING neutronclient.v2_0.client [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.220 2 DEBUG nova.network.neutron [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Updating instance_info_cache with network_info: [{"id": "9e1db054-d550-4384-9fd6-118c2eea0c89", "address": "fa:16:3e:2e:d6:d3", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e1db054-d5", "ovs_interfaceid": "9e1db054-d550-4384-9fd6-118c2eea0c89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.728 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Releasing lock "refresh_cache-ff5702e3-c6c5-4b82-a9c4-6a06747a4cae" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.729 2 DEBUG nova.compute.manager [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Instance network_info: |[{"id": "9e1db054-d550-4384-9fd6-118c2eea0c89", "address": "fa:16:3e:2e:d6:d3", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e1db054-d5", "ovs_interfaceid": "9e1db054-d550-4384-9fd6-118c2eea0c89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.733 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Start _get_guest_xml network_info=[{"id": "9e1db054-d550-4384-9fd6-118c2eea0c89", "address": "fa:16:3e:2e:d6:d3", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e1db054-d5", "ovs_interfaceid": "9e1db054-d550-4384-9fd6-118c2eea0c89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.739 2 WARNING nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.742 2 DEBUG nova.virt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1626632076', uuid='ff5702e3-c6c5-4b82-a9c4-6a06747a4cae'), owner=OwnerMeta(userid='82619989ef1f48a39f1c1e7d64e4cb38', username='tempest-TestExecuteActionsViaActuator-2075848047-project-admin', projectid='67079b4774294271895bbf7b04f602e7', projectname='tempest-TestExecuteActionsViaActuator-2075848047'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "9e1db054-d550-4384-9fd6-118c2eea0c89", "address": "fa:16:3e:2e:d6:d3", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e1db054-d5", "ovs_interfaceid": "9e1db054-d550-4384-9fd6-118c2eea0c89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759327548.7420018) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.749 2 DEBUG nova.virt.libvirt.host [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.750 2 DEBUG nova.virt.libvirt.host [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.754 2 DEBUG nova.virt.libvirt.host [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.754 2 DEBUG nova.virt.libvirt.host [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.755 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.755 2 DEBUG nova.virt.hardware [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.756 2 DEBUG nova.virt.hardware [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.757 2 DEBUG nova.virt.hardware [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.757 2 DEBUG nova.virt.hardware [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.757 2 DEBUG nova.virt.hardware [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.758 2 DEBUG nova.virt.hardware [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.758 2 DEBUG nova.virt.hardware [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.759 2 DEBUG nova.virt.hardware [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.759 2 DEBUG nova.virt.hardware [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.759 2 DEBUG nova.virt.hardware [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.760 2 DEBUG nova.virt.hardware [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.766 2 DEBUG nova.virt.libvirt.vif [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:05:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1626632076',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1626632076',id=7,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-yli9a99r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteActionsViaActuator-2075848047-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:05:41Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=ff5702e3-c6c5-4b82-a9c4-6a06747a4cae,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e1db054-d550-4384-9fd6-118c2eea0c89", "address": "fa:16:3e:2e:d6:d3", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e1db054-d5", "ovs_interfaceid": "9e1db054-d550-4384-9fd6-118c2eea0c89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.767 2 DEBUG nova.network.os_vif_util [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converting VIF {"id": "9e1db054-d550-4384-9fd6-118c2eea0c89", "address": "fa:16:3e:2e:d6:d3", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e1db054-d5", "ovs_interfaceid": "9e1db054-d550-4384-9fd6-118c2eea0c89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.768 2 DEBUG nova.network.os_vif_util [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:d6:d3,bridge_name='br-int',has_traffic_filtering=True,id=9e1db054-d550-4384-9fd6-118c2eea0c89,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e1db054-d5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:05:48 compute-0 nova_compute[192698]: 2025-10-01 14:05:48.770 2 DEBUG nova.objects.instance [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid ff5702e3-c6c5-4b82-a9c4-6a06747a4cae obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.279 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:05:49 compute-0 nova_compute[192698]:   <uuid>ff5702e3-c6c5-4b82-a9c4-6a06747a4cae</uuid>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   <name>instance-00000007</name>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1626632076</nova:name>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:05:48</nova:creationTime>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:05:49 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:05:49 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:user uuid="82619989ef1f48a39f1c1e7d64e4cb38">tempest-TestExecuteActionsViaActuator-2075848047-project-admin</nova:user>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:project uuid="67079b4774294271895bbf7b04f602e7">tempest-TestExecuteActionsViaActuator-2075848047</nova:project>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         <nova:port uuid="9e1db054-d550-4384-9fd6-118c2eea0c89">
Oct 01 14:05:49 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <system>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <entry name="serial">ff5702e3-c6c5-4b82-a9c4-6a06747a4cae</entry>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <entry name="uuid">ff5702e3-c6c5-4b82-a9c4-6a06747a4cae</entry>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     </system>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   <os>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   </os>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   <features>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   </features>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk.config"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:2e:d6:d3"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <target dev="tap9e1db054-d5"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/console.log" append="off"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <video>
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     </video>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:05:49 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:05:49 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:05:49 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:05:49 compute-0 nova_compute[192698]: </domain>
Oct 01 14:05:49 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.281 2 DEBUG nova.compute.manager [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Preparing to wait for external event network-vif-plugged-9e1db054-d550-4384-9fd6-118c2eea0c89 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.282 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.282 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.283 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.284 2 DEBUG nova.virt.libvirt.vif [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:05:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1626632076',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1626632076',id=7,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-yli9a99r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteActionsViaActuator-2075848047-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:05:41Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=ff5702e3-c6c5-4b82-a9c4-6a06747a4cae,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9e1db054-d550-4384-9fd6-118c2eea0c89", "address": "fa:16:3e:2e:d6:d3", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e1db054-d5", "ovs_interfaceid": "9e1db054-d550-4384-9fd6-118c2eea0c89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.285 2 DEBUG nova.network.os_vif_util [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converting VIF {"id": "9e1db054-d550-4384-9fd6-118c2eea0c89", "address": "fa:16:3e:2e:d6:d3", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e1db054-d5", "ovs_interfaceid": "9e1db054-d550-4384-9fd6-118c2eea0c89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.286 2 DEBUG nova.network.os_vif_util [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:d6:d3,bridge_name='br-int',has_traffic_filtering=True,id=9e1db054-d550-4384-9fd6-118c2eea0c89,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e1db054-d5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.286 2 DEBUG os_vif [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:d6:d3,bridge_name='br-int',has_traffic_filtering=True,id=9e1db054-d550-4384-9fd6-118c2eea0c89,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e1db054-d5') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.288 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.288 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.290 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9c359440-2b77-516f-a15d-5495d640f7b0', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.300 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9e1db054-d5, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.300 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap9e1db054-d5, col_values=(('qos', UUID('d03bfbf1-5097-42dd-8e4d-97d3e2000f4e')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.301 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap9e1db054-d5, col_values=(('external_ids', {'iface-id': '9e1db054-d550-4384-9fd6-118c2eea0c89', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:d6:d3', 'vm-uuid': 'ff5702e3-c6c5-4b82-a9c4-6a06747a4cae'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:49 compute-0 NetworkManager[51741]: <info>  [1759327549.3041] manager: (tap9e1db054-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:49 compute-0 nova_compute[192698]: 2025-10-01 14:05:49.315 2 INFO os_vif [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:d6:d3,bridge_name='br-int',has_traffic_filtering=True,id=9e1db054-d550-4384-9fd6-118c2eea0c89,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e1db054-d5')
Oct 01 14:05:50 compute-0 nova_compute[192698]: 2025-10-01 14:05:50.909 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:05:50 compute-0 nova_compute[192698]: 2025-10-01 14:05:50.909 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:05:50 compute-0 nova_compute[192698]: 2025-10-01 14:05:50.910 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] No VIF found with MAC fa:16:3e:2e:d6:d3, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:05:50 compute-0 nova_compute[192698]: 2025-10-01 14:05:50.911 2 INFO nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Using config drive
Oct 01 14:05:51 compute-0 podman[216872]: 2025-10-01 14:05:51.200315537 +0000 UTC m=+0.102897785 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, architecture=x86_64, vcs-type=git, com.redhat.component=ubi9-minimal-container, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 01 14:05:51 compute-0 nova_compute[192698]: 2025-10-01 14:05:51.426 2 WARNING neutronclient.v2_0.client [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.076 2 INFO nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Creating config drive at /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk.config
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.088 2 DEBUG oslo_concurrency.processutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpy_h7er83 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.232 2 DEBUG oslo_concurrency.processutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpy_h7er83" returned: 0 in 0.144s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:05:52 compute-0 kernel: tap9e1db054-d5: entered promiscuous mode
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:52 compute-0 NetworkManager[51741]: <info>  [1759327552.2989] manager: (tap9e1db054-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Oct 01 14:05:52 compute-0 ovn_controller[94909]: 2025-10-01T14:05:52Z|00054|binding|INFO|Claiming lport 9e1db054-d550-4384-9fd6-118c2eea0c89 for this chassis.
Oct 01 14:05:52 compute-0 ovn_controller[94909]: 2025-10-01T14:05:52Z|00055|binding|INFO|9e1db054-d550-4384-9fd6-118c2eea0c89: Claiming fa:16:3e:2e:d6:d3 10.100.0.8
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.310 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:d6:d3 10.100.0.8'], port_security=['fa:16:3e:2e:d6:d3 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'ff5702e3-c6c5-4b82-a9c4-6a06747a4cae', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67079b4774294271895bbf7b04f602e7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'de7872a1-1f76-4b0f-8bd9-119520ff7a88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e3a455d-1f77-441e-b08a-0ec8231910e5, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=9e1db054-d550-4384-9fd6-118c2eea0c89) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.312 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 9e1db054-d550-4384-9fd6-118c2eea0c89 in datapath e35f096a-fd75-4d70-ae58-8a76ae666b9d bound to our chassis
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.314 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e35f096a-fd75-4d70-ae58-8a76ae666b9d
Oct 01 14:05:52 compute-0 ovn_controller[94909]: 2025-10-01T14:05:52Z|00056|binding|INFO|Setting lport 9e1db054-d550-4384-9fd6-118c2eea0c89 ovn-installed in OVS
Oct 01 14:05:52 compute-0 ovn_controller[94909]: 2025-10-01T14:05:52Z|00057|binding|INFO|Setting lport 9e1db054-d550-4384-9fd6-118c2eea0c89 up in Southbound
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.329 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1e1c3b78-f7b1-4f61-8db5-55f57b6e05b1]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:05:52 compute-0 systemd-udevd[216912]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:05:52 compute-0 systemd-machined[152704]: New machine qemu-3-instance-00000007.
Oct 01 14:05:52 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Oct 01 14:05:52 compute-0 NetworkManager[51741]: <info>  [1759327552.3566] device (tap9e1db054-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:05:52 compute-0 NetworkManager[51741]: <info>  [1759327552.3593] device (tap9e1db054-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.372 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[27424d71-4139-42e5-a764-59835a0156c5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.375 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[61746f07-e806-41a3-b8ec-8106b0362b03]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.411 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a6948a30-68b6-41ea-8859-6ce9ce810d79]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.438 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[5f6ccd0c-f53c-4a63-ab67-0bf2e94d9ce7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape35f096a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:1b:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382912, 'reachable_time': 20041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216923, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.458 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[be8d677f-5ee6-4055-8062-5b95f0c56c12]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382931, 'tstamp': 382931}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216926, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382936, 'tstamp': 382936}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216926, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.461 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape35f096a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.495 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape35f096a-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.496 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.497 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape35f096a-f0, col_values=(('external_ids', {'iface-id': '3f9111f1-79b1-4bf1-bb95-d924c71fb42c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.497 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.497 2 DEBUG nova.compute.manager [req-ef3bc5d7-46bb-4e87-b1a3-e6bf497f9d6d req-1dc35d51-ff1e-4128-bab7-1984777bb039 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Received event network-vif-plugged-9e1db054-d550-4384-9fd6-118c2eea0c89 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.498 2 DEBUG oslo_concurrency.lockutils [req-ef3bc5d7-46bb-4e87-b1a3-e6bf497f9d6d req-1dc35d51-ff1e-4128-bab7-1984777bb039 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.498 2 DEBUG oslo_concurrency.lockutils [req-ef3bc5d7-46bb-4e87-b1a3-e6bf497f9d6d req-1dc35d51-ff1e-4128-bab7-1984777bb039 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.498 2 DEBUG oslo_concurrency.lockutils [req-ef3bc5d7-46bb-4e87-b1a3-e6bf497f9d6d req-1dc35d51-ff1e-4128-bab7-1984777bb039 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.499 2 DEBUG nova.compute.manager [req-ef3bc5d7-46bb-4e87-b1a3-e6bf497f9d6d req-1dc35d51-ff1e-4128-bab7-1984777bb039 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Processing event network-vif-plugged-9e1db054-d550-4384-9fd6-118c2eea0c89 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:52 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:05:52.500 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[3266b98e-aafe-434d-9502-8893dcd298b9]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-e35f096a-fd75-4d70-ae58-8a76ae666b9d\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID e35f096a-fd75-4d70-ae58-8a76ae666b9d\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:05:52 compute-0 nova_compute[192698]: 2025-10-01 14:05:52.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:53 compute-0 nova_compute[192698]: 2025-10-01 14:05:53.180 2 DEBUG nova.compute.manager [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:05:53 compute-0 nova_compute[192698]: 2025-10-01 14:05:53.185 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 01 14:05:53 compute-0 nova_compute[192698]: 2025-10-01 14:05:53.190 2 INFO nova.virt.libvirt.driver [-] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Instance spawned successfully.
Oct 01 14:05:53 compute-0 nova_compute[192698]: 2025-10-01 14:05:53.190 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 01 14:05:53 compute-0 nova_compute[192698]: 2025-10-01 14:05:53.712 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:05:53 compute-0 nova_compute[192698]: 2025-10-01 14:05:53.713 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:05:53 compute-0 nova_compute[192698]: 2025-10-01 14:05:53.714 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:05:53 compute-0 nova_compute[192698]: 2025-10-01 14:05:53.715 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:05:53 compute-0 nova_compute[192698]: 2025-10-01 14:05:53.716 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:05:53 compute-0 nova_compute[192698]: 2025-10-01 14:05:53.717 2 DEBUG nova.virt.libvirt.driver [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:05:54 compute-0 nova_compute[192698]: 2025-10-01 14:05:54.230 2 INFO nova.compute.manager [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Took 11.37 seconds to spawn the instance on the hypervisor.
Oct 01 14:05:54 compute-0 nova_compute[192698]: 2025-10-01 14:05:54.231 2 DEBUG nova.compute.manager [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:05:54 compute-0 nova_compute[192698]: 2025-10-01 14:05:54.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:54 compute-0 nova_compute[192698]: 2025-10-01 14:05:54.563 2 DEBUG nova.compute.manager [req-1c7dfd93-110b-48f3-aee5-598c746b987e req-f06d4fa7-7110-4f21-ae8f-91bbd27d0a29 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Received event network-vif-plugged-9e1db054-d550-4384-9fd6-118c2eea0c89 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:05:54 compute-0 nova_compute[192698]: 2025-10-01 14:05:54.565 2 DEBUG oslo_concurrency.lockutils [req-1c7dfd93-110b-48f3-aee5-598c746b987e req-f06d4fa7-7110-4f21-ae8f-91bbd27d0a29 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:05:54 compute-0 nova_compute[192698]: 2025-10-01 14:05:54.565 2 DEBUG oslo_concurrency.lockutils [req-1c7dfd93-110b-48f3-aee5-598c746b987e req-f06d4fa7-7110-4f21-ae8f-91bbd27d0a29 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:05:54 compute-0 nova_compute[192698]: 2025-10-01 14:05:54.565 2 DEBUG oslo_concurrency.lockutils [req-1c7dfd93-110b-48f3-aee5-598c746b987e req-f06d4fa7-7110-4f21-ae8f-91bbd27d0a29 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:05:54 compute-0 nova_compute[192698]: 2025-10-01 14:05:54.566 2 DEBUG nova.compute.manager [req-1c7dfd93-110b-48f3-aee5-598c746b987e req-f06d4fa7-7110-4f21-ae8f-91bbd27d0a29 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] No waiting events found dispatching network-vif-plugged-9e1db054-d550-4384-9fd6-118c2eea0c89 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:05:54 compute-0 nova_compute[192698]: 2025-10-01 14:05:54.566 2 WARNING nova.compute.manager [req-1c7dfd93-110b-48f3-aee5-598c746b987e req-f06d4fa7-7110-4f21-ae8f-91bbd27d0a29 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Received unexpected event network-vif-plugged-9e1db054-d550-4384-9fd6-118c2eea0c89 for instance with vm_state active and task_state None.
Oct 01 14:05:54 compute-0 nova_compute[192698]: 2025-10-01 14:05:54.789 2 INFO nova.compute.manager [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Took 16.67 seconds to build instance.
Oct 01 14:05:55 compute-0 nova_compute[192698]: 2025-10-01 14:05:55.296 2 DEBUG oslo_concurrency.lockutils [None req-bf2c9a83-0e6a-45aa-8c07-c857e249d588 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.206s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:05:57 compute-0 podman[216934]: 2025-10-01 14:05:57.182898912 +0000 UTC m=+0.089609776 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.4)
Oct 01 14:05:57 compute-0 podman[216935]: 2025-10-01 14:05:57.199434007 +0000 UTC m=+0.098181511 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd)
Oct 01 14:05:57 compute-0 nova_compute[192698]: 2025-10-01 14:05:57.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:59 compute-0 nova_compute[192698]: 2025-10-01 14:05:59.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:05:59 compute-0 podman[203144]: time="2025-10-01T14:05:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:05:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:05:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:05:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:05:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3464 "" "Go-http-client/1.1"
Oct 01 14:06:01 compute-0 openstack_network_exporter[205307]: ERROR   14:06:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:06:01 compute-0 openstack_network_exporter[205307]: ERROR   14:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:06:01 compute-0 openstack_network_exporter[205307]: ERROR   14:06:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:06:01 compute-0 openstack_network_exporter[205307]: ERROR   14:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:06:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:06:01 compute-0 openstack_network_exporter[205307]: ERROR   14:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:06:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:06:02 compute-0 nova_compute[192698]: 2025-10-01 14:06:02.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:04 compute-0 nova_compute[192698]: 2025-10-01 14:06:04.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:05 compute-0 podman[216984]: 2025-10-01 14:06:05.170872539 +0000 UTC m=+0.077378364 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 14:06:05 compute-0 nova_compute[192698]: 2025-10-01 14:06:05.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:06:06 compute-0 ovn_controller[94909]: 2025-10-01T14:06:06Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2e:d6:d3 10.100.0.8
Oct 01 14:06:06 compute-0 ovn_controller[94909]: 2025-10-01T14:06:06Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2e:d6:d3 10.100.0.8
Oct 01 14:06:07 compute-0 nova_compute[192698]: 2025-10-01 14:06:07.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:07 compute-0 nova_compute[192698]: 2025-10-01 14:06:07.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:06:08 compute-0 nova_compute[192698]: 2025-10-01 14:06:08.458 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:06:08 compute-0 nova_compute[192698]: 2025-10-01 14:06:08.459 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:06:08 compute-0 nova_compute[192698]: 2025-10-01 14:06:08.460 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:06:08 compute-0 nova_compute[192698]: 2025-10-01 14:06:08.461 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:06:09 compute-0 nova_compute[192698]: 2025-10-01 14:06:09.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:09 compute-0 nova_compute[192698]: 2025-10-01 14:06:09.522 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:06:09 compute-0 nova_compute[192698]: 2025-10-01 14:06:09.612 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:06:09 compute-0 nova_compute[192698]: 2025-10-01 14:06:09.614 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:06:09 compute-0 nova_compute[192698]: 2025-10-01 14:06:09.707 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:06:09 compute-0 nova_compute[192698]: 2025-10-01 14:06:09.720 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:06:09 compute-0 nova_compute[192698]: 2025-10-01 14:06:09.797 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:06:09 compute-0 nova_compute[192698]: 2025-10-01 14:06:09.799 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:06:09 compute-0 nova_compute[192698]: 2025-10-01 14:06:09.887 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:06:10 compute-0 nova_compute[192698]: 2025-10-01 14:06:10.135 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:06:10 compute-0 nova_compute[192698]: 2025-10-01 14:06:10.136 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:06:10 compute-0 nova_compute[192698]: 2025-10-01 14:06:10.180 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:06:10 compute-0 nova_compute[192698]: 2025-10-01 14:06:10.181 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5489MB free_disk=73.24908065795898GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:06:10 compute-0 nova_compute[192698]: 2025-10-01 14:06:10.181 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:06:10 compute-0 nova_compute[192698]: 2025-10-01 14:06:10.182 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:06:11 compute-0 nova_compute[192698]: 2025-10-01 14:06:11.265 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 28407011-1056-4714-96fc-1e8904bbcf1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:06:11 compute-0 nova_compute[192698]: 2025-10-01 14:06:11.267 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance ff5702e3-c6c5-4b82-a9c4-6a06747a4cae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:06:11 compute-0 nova_compute[192698]: 2025-10-01 14:06:11.268 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:06:11 compute-0 nova_compute[192698]: 2025-10-01 14:06:11.268 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:06:10 up  1:05,  0 user,  load average: 0.43, 0.37, 0.50\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_67079b4774294271895bbf7b04f602e7': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:06:11 compute-0 nova_compute[192698]: 2025-10-01 14:06:11.334 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:06:11 compute-0 nova_compute[192698]: 2025-10-01 14:06:11.843 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:06:12 compute-0 nova_compute[192698]: 2025-10-01 14:06:12.357 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:06:12 compute-0 nova_compute[192698]: 2025-10-01 14:06:12.358 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.177s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:06:12 compute-0 nova_compute[192698]: 2025-10-01 14:06:12.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:13 compute-0 nova_compute[192698]: 2025-10-01 14:06:13.358 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:06:13 compute-0 nova_compute[192698]: 2025-10-01 14:06:13.359 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:06:13 compute-0 nova_compute[192698]: 2025-10-01 14:06:13.359 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:06:13 compute-0 nova_compute[192698]: 2025-10-01 14:06:13.359 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:06:13 compute-0 nova_compute[192698]: 2025-10-01 14:06:13.359 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:06:13 compute-0 nova_compute[192698]: 2025-10-01 14:06:13.360 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:06:13 compute-0 nova_compute[192698]: 2025-10-01 14:06:13.360 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:06:14 compute-0 podman[217022]: 2025-10-01 14:06:14.182205742 +0000 UTC m=+0.094314769 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 01 14:06:14 compute-0 podman[217023]: 2025-10-01 14:06:14.233422698 +0000 UTC m=+0.133642030 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 01 14:06:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:14.237 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:06:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:14.238 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:06:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:14.238 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:06:14 compute-0 nova_compute[192698]: 2025-10-01 14:06:14.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:17 compute-0 nova_compute[192698]: 2025-10-01 14:06:17.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:19 compute-0 nova_compute[192698]: 2025-10-01 14:06:19.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:22 compute-0 nova_compute[192698]: 2025-10-01 14:06:22.129 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:06:22 compute-0 nova_compute[192698]: 2025-10-01 14:06:22.129 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:06:22 compute-0 podman[217067]: 2025-10-01 14:06:22.167290604 +0000 UTC m=+0.071623936 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=)
Oct 01 14:06:22 compute-0 nova_compute[192698]: 2025-10-01 14:06:22.635 2 DEBUG nova.compute.manager [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 01 14:06:22 compute-0 nova_compute[192698]: 2025-10-01 14:06:22.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:23 compute-0 nova_compute[192698]: 2025-10-01 14:06:23.204 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:06:23 compute-0 nova_compute[192698]: 2025-10-01 14:06:23.205 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:06:23 compute-0 nova_compute[192698]: 2025-10-01 14:06:23.213 2 DEBUG nova.virt.hardware [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:06:23 compute-0 nova_compute[192698]: 2025-10-01 14:06:23.213 2 INFO nova.compute.claims [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:06:24 compute-0 nova_compute[192698]: 2025-10-01 14:06:24.306 2 DEBUG nova.compute.provider_tree [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:06:24 compute-0 nova_compute[192698]: 2025-10-01 14:06:24.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:24 compute-0 nova_compute[192698]: 2025-10-01 14:06:24.814 2 DEBUG nova.scheduler.client.report [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:06:25 compute-0 nova_compute[192698]: 2025-10-01 14:06:25.332 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.127s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:06:25 compute-0 nova_compute[192698]: 2025-10-01 14:06:25.333 2 DEBUG nova.compute.manager [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 01 14:06:25 compute-0 nova_compute[192698]: 2025-10-01 14:06:25.850 2 DEBUG nova.compute.manager [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 01 14:06:25 compute-0 nova_compute[192698]: 2025-10-01 14:06:25.851 2 DEBUG nova.network.neutron [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 01 14:06:25 compute-0 nova_compute[192698]: 2025-10-01 14:06:25.852 2 WARNING neutronclient.v2_0.client [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:06:25 compute-0 nova_compute[192698]: 2025-10-01 14:06:25.852 2 WARNING neutronclient.v2_0.client [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:06:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:26.334 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:06:26 compute-0 nova_compute[192698]: 2025-10-01 14:06:26.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:26.335 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:06:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:26.337 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:06:26 compute-0 nova_compute[192698]: 2025-10-01 14:06:26.361 2 INFO nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 01 14:06:26 compute-0 nova_compute[192698]: 2025-10-01 14:06:26.439 2 DEBUG nova.network.neutron [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Successfully created port: c3ad6fe1-af17-4f0b-86c4-8d4384248932 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 01 14:06:26 compute-0 nova_compute[192698]: 2025-10-01 14:06:26.872 2 DEBUG nova.compute.manager [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.481 2 DEBUG nova.network.neutron [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Successfully updated port: c3ad6fe1-af17-4f0b-86c4-8d4384248932 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.554 2 DEBUG nova.compute.manager [req-92594f1f-3293-427a-8571-c56288c5204f req-3e24dbd9-b418-4330-9508-599ee6fc129f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Received event network-changed-c3ad6fe1-af17-4f0b-86c4-8d4384248932 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.555 2 DEBUG nova.compute.manager [req-92594f1f-3293-427a-8571-c56288c5204f req-3e24dbd9-b418-4330-9508-599ee6fc129f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Refreshing instance network info cache due to event network-changed-c3ad6fe1-af17-4f0b-86c4-8d4384248932. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.555 2 DEBUG oslo_concurrency.lockutils [req-92594f1f-3293-427a-8571-c56288c5204f req-3e24dbd9-b418-4330-9508-599ee6fc129f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-ad84d705-7d86-4faf-a1d4-b099e7b6a80f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.556 2 DEBUG oslo_concurrency.lockutils [req-92594f1f-3293-427a-8571-c56288c5204f req-3e24dbd9-b418-4330-9508-599ee6fc129f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-ad84d705-7d86-4faf-a1d4-b099e7b6a80f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.556 2 DEBUG nova.network.neutron [req-92594f1f-3293-427a-8571-c56288c5204f req-3e24dbd9-b418-4330-9508-599ee6fc129f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Refreshing network info cache for port c3ad6fe1-af17-4f0b-86c4-8d4384248932 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.919 2 DEBUG nova.compute.manager [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.921 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.921 2 INFO nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Creating image(s)
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.921 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "/var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.922 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "/var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.922 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "/var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.923 2 DEBUG oslo_utils.imageutils.format_inspector [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.925 2 DEBUG oslo_utils.imageutils.format_inspector [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.927 2 DEBUG oslo_concurrency.processutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:06:27 compute-0 nova_compute[192698]: 2025-10-01 14:06:27.990 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "refresh_cache-ad84d705-7d86-4faf-a1d4-b099e7b6a80f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.014 2 DEBUG oslo_concurrency.processutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.015 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.016 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.017 2 DEBUG oslo_utils.imageutils.format_inspector [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.024 2 DEBUG oslo_utils.imageutils.format_inspector [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.025 2 DEBUG oslo_concurrency.processutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.065 2 WARNING neutronclient.v2_0.client [req-92594f1f-3293-427a-8571-c56288c5204f req-3e24dbd9-b418-4330-9508-599ee6fc129f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.117 2 DEBUG oslo_concurrency.processutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.118 2 DEBUG oslo_concurrency.processutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:06:28 compute-0 podman[217092]: 2025-10-01 14:06:28.189515298 +0000 UTC m=+0.091634324 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=multipathd)
Oct 01 14:06:28 compute-0 podman[217091]: 2025-10-01 14:06:28.198470643 +0000 UTC m=+0.113684863 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.248 2 DEBUG oslo_concurrency.processutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk 1073741824" returned: 0 in 0.130s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.250 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.234s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.250 2 DEBUG oslo_concurrency.processutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.339 2 DEBUG oslo_concurrency.processutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.340 2 DEBUG nova.virt.disk.api [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Checking if we can resize image /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.341 2 DEBUG oslo_concurrency.processutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.432 2 DEBUG oslo_concurrency.processutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.433 2 DEBUG nova.virt.disk.api [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Cannot resize image /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.433 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.434 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Ensure instance console log exists: /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.434 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.435 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:06:28 compute-0 nova_compute[192698]: 2025-10-01 14:06:28.435 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:06:29 compute-0 nova_compute[192698]: 2025-10-01 14:06:29.019 2 DEBUG nova.network.neutron [req-92594f1f-3293-427a-8571-c56288c5204f req-3e24dbd9-b418-4330-9508-599ee6fc129f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:06:29 compute-0 nova_compute[192698]: 2025-10-01 14:06:29.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:29 compute-0 podman[203144]: time="2025-10-01T14:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:06:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:06:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3465 "" "Go-http-client/1.1"
Oct 01 14:06:29 compute-0 nova_compute[192698]: 2025-10-01 14:06:29.990 2 DEBUG nova.network.neutron [req-92594f1f-3293-427a-8571-c56288c5204f req-3e24dbd9-b418-4330-9508-599ee6fc129f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:06:30 compute-0 nova_compute[192698]: 2025-10-01 14:06:30.498 2 DEBUG oslo_concurrency.lockutils [req-92594f1f-3293-427a-8571-c56288c5204f req-3e24dbd9-b418-4330-9508-599ee6fc129f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-ad84d705-7d86-4faf-a1d4-b099e7b6a80f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:06:30 compute-0 nova_compute[192698]: 2025-10-01 14:06:30.500 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquired lock "refresh_cache-ad84d705-7d86-4faf-a1d4-b099e7b6a80f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:06:30 compute-0 nova_compute[192698]: 2025-10-01 14:06:30.501 2 DEBUG nova.network.neutron [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:06:31 compute-0 nova_compute[192698]: 2025-10-01 14:06:31.101 2 DEBUG nova.network.neutron [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:06:31 compute-0 openstack_network_exporter[205307]: ERROR   14:06:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:06:31 compute-0 openstack_network_exporter[205307]: ERROR   14:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:06:31 compute-0 openstack_network_exporter[205307]: ERROR   14:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:06:31 compute-0 openstack_network_exporter[205307]: ERROR   14:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:06:31 compute-0 openstack_network_exporter[205307]: ERROR   14:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:06:31 compute-0 nova_compute[192698]: 2025-10-01 14:06:31.999 2 WARNING neutronclient.v2_0.client [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.129 2 DEBUG nova.network.neutron [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Updating instance_info_cache with network_info: [{"id": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "address": "fa:16:3e:15:67:38", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3ad6fe1-af", "ovs_interfaceid": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.636 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Releasing lock "refresh_cache-ad84d705-7d86-4faf-a1d4-b099e7b6a80f" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.637 2 DEBUG nova.compute.manager [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Instance network_info: |[{"id": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "address": "fa:16:3e:15:67:38", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3ad6fe1-af", "ovs_interfaceid": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.641 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Start _get_guest_xml network_info=[{"id": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "address": "fa:16:3e:15:67:38", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3ad6fe1-af", "ovs_interfaceid": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.648 2 WARNING nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.650 2 DEBUG nova.virt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1504825469', uuid='ad84d705-7d86-4faf-a1d4-b099e7b6a80f'), owner=OwnerMeta(userid='82619989ef1f48a39f1c1e7d64e4cb38', username='tempest-TestExecuteActionsViaActuator-2075848047-project-admin', projectid='67079b4774294271895bbf7b04f602e7', projectname='tempest-TestExecuteActionsViaActuator-2075848047'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "address": "fa:16:3e:15:67:38", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3ad6fe1-af", "ovs_interfaceid": 
"c3ad6fe1-af17-4f0b-86c4-8d4384248932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759327592.6504169) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.655 2 DEBUG nova.virt.libvirt.host [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.656 2 DEBUG nova.virt.libvirt.host [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.661 2 DEBUG nova.virt.libvirt.host [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.662 2 DEBUG nova.virt.libvirt.host [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.663 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.664 2 DEBUG nova.virt.hardware [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.665 2 DEBUG nova.virt.hardware [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.665 2 DEBUG nova.virt.hardware [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.665 2 DEBUG nova.virt.hardware [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.666 2 DEBUG nova.virt.hardware [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.666 2 DEBUG nova.virt.hardware [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.667 2 DEBUG nova.virt.hardware [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.667 2 DEBUG nova.virt.hardware [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.667 2 DEBUG nova.virt.hardware [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.668 2 DEBUG nova.virt.hardware [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.668 2 DEBUG nova.virt.hardware [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.676 2 DEBUG nova.virt.libvirt.vif [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:06:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1504825469',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1504825469',id=9,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-3x429zj8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteActionsViaA
ctuator-2075848047-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:06:26Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=ad84d705-7d86-4faf-a1d4-b099e7b6a80f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "address": "fa:16:3e:15:67:38", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3ad6fe1-af", "ovs_interfaceid": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.677 2 DEBUG nova.network.os_vif_util [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converting VIF {"id": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "address": "fa:16:3e:15:67:38", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3ad6fe1-af", "ovs_interfaceid": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.678 2 DEBUG nova.network.os_vif_util [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:67:38,bridge_name='br-int',has_traffic_filtering=True,id=c3ad6fe1-af17-4f0b-86c4-8d4384248932,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3ad6fe1-af') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.682 2 DEBUG nova.objects.instance [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lazy-loading 'pci_devices' on Instance uuid ad84d705-7d86-4faf-a1d4-b099e7b6a80f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:06:32 compute-0 nova_compute[192698]: 2025-10-01 14:06:32.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.222 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:06:33 compute-0 nova_compute[192698]:   <uuid>ad84d705-7d86-4faf-a1d4-b099e7b6a80f</uuid>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   <name>instance-00000009</name>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1504825469</nova:name>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:06:32</nova:creationTime>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:06:33 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:06:33 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:user uuid="82619989ef1f48a39f1c1e7d64e4cb38">tempest-TestExecuteActionsViaActuator-2075848047-project-admin</nova:user>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:project uuid="67079b4774294271895bbf7b04f602e7">tempest-TestExecuteActionsViaActuator-2075848047</nova:project>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         <nova:port uuid="c3ad6fe1-af17-4f0b-86c4-8d4384248932">
Oct 01 14:06:33 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <system>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <entry name="serial">ad84d705-7d86-4faf-a1d4-b099e7b6a80f</entry>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <entry name="uuid">ad84d705-7d86-4faf-a1d4-b099e7b6a80f</entry>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     </system>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   <os>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   </os>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   <features>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   </features>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk.config"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:15:67:38"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <target dev="tapc3ad6fe1-af"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/console.log" append="off"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <video>
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     </video>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:06:33 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:06:33 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:06:33 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:06:33 compute-0 nova_compute[192698]: </domain>
Oct 01 14:06:33 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.223 2 DEBUG nova.compute.manager [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Preparing to wait for external event network-vif-plugged-c3ad6fe1-af17-4f0b-86c4-8d4384248932 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.223 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.223 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.224 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.225 2 DEBUG nova.virt.libvirt.vif [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:06:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1504825469',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1504825469',id=9,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-3x429zj8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteActionsViaActuator-2075848047-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:06:26Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=ad84d705-7d86-4faf-a1d4-b099e7b6a80f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "address": "fa:16:3e:15:67:38", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3ad6fe1-af", "ovs_interfaceid": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.226 2 DEBUG nova.network.os_vif_util [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converting VIF {"id": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "address": "fa:16:3e:15:67:38", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3ad6fe1-af", "ovs_interfaceid": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.227 2 DEBUG nova.network.os_vif_util [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:67:38,bridge_name='br-int',has_traffic_filtering=True,id=c3ad6fe1-af17-4f0b-86c4-8d4384248932,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3ad6fe1-af') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.228 2 DEBUG os_vif [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:67:38,bridge_name='br-int',has_traffic_filtering=True,id=c3ad6fe1-af17-4f0b-86c4-8d4384248932,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3ad6fe1-af') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.229 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.230 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.232 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '6173aa6a-f8fe-5c77-b95a-33c34adff935', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.244 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc3ad6fe1-af, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.244 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc3ad6fe1-af, col_values=(('qos', UUID('0ff30eed-2380-4298-8432-18882c47bb3e')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.245 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc3ad6fe1-af, col_values=(('external_ids', {'iface-id': 'c3ad6fe1-af17-4f0b-86c4-8d4384248932', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:67:38', 'vm-uuid': 'ad84d705-7d86-4faf-a1d4-b099e7b6a80f'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:06:33 compute-0 NetworkManager[51741]: <info>  [1759327593.2480] manager: (tapc3ad6fe1-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:33 compute-0 nova_compute[192698]: 2025-10-01 14:06:33.260 2 INFO os_vif [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:67:38,bridge_name='br-int',has_traffic_filtering=True,id=c3ad6fe1-af17-4f0b-86c4-8d4384248932,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3ad6fe1-af')
Oct 01 14:06:34 compute-0 nova_compute[192698]: 2025-10-01 14:06:34.815 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:06:34 compute-0 nova_compute[192698]: 2025-10-01 14:06:34.815 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:06:34 compute-0 nova_compute[192698]: 2025-10-01 14:06:34.816 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] No VIF found with MAC fa:16:3e:15:67:38, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:06:34 compute-0 nova_compute[192698]: 2025-10-01 14:06:34.816 2 INFO nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Using config drive
Oct 01 14:06:35 compute-0 nova_compute[192698]: 2025-10-01 14:06:35.334 2 WARNING neutronclient.v2_0.client [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:06:35 compute-0 nova_compute[192698]: 2025-10-01 14:06:35.505 2 INFO nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Creating config drive at /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk.config
Oct 01 14:06:35 compute-0 nova_compute[192698]: 2025-10-01 14:06:35.515 2 DEBUG oslo_concurrency.processutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmphlryjojo execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:06:35 compute-0 nova_compute[192698]: 2025-10-01 14:06:35.662 2 DEBUG oslo_concurrency.processutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmphlryjojo" returned: 0 in 0.146s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:06:35 compute-0 kernel: tapc3ad6fe1-af: entered promiscuous mode
Oct 01 14:06:35 compute-0 NetworkManager[51741]: <info>  [1759327595.7617] manager: (tapc3ad6fe1-af): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Oct 01 14:06:35 compute-0 ovn_controller[94909]: 2025-10-01T14:06:35Z|00058|binding|INFO|Claiming lport c3ad6fe1-af17-4f0b-86c4-8d4384248932 for this chassis.
Oct 01 14:06:35 compute-0 ovn_controller[94909]: 2025-10-01T14:06:35Z|00059|binding|INFO|c3ad6fe1-af17-4f0b-86c4-8d4384248932: Claiming fa:16:3e:15:67:38 10.100.0.6
Oct 01 14:06:35 compute-0 nova_compute[192698]: 2025-10-01 14:06:35.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.774 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:67:38 10.100.0.6'], port_security=['fa:16:3e:15:67:38 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ad84d705-7d86-4faf-a1d4-b099e7b6a80f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67079b4774294271895bbf7b04f602e7', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'de7872a1-1f76-4b0f-8bd9-119520ff7a88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e3a455d-1f77-441e-b08a-0ec8231910e5, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=c3ad6fe1-af17-4f0b-86c4-8d4384248932) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.775 103791 INFO neutron.agent.ovn.metadata.agent [-] Port c3ad6fe1-af17-4f0b-86c4-8d4384248932 in datapath e35f096a-fd75-4d70-ae58-8a76ae666b9d bound to our chassis
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.777 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e35f096a-fd75-4d70-ae58-8a76ae666b9d
Oct 01 14:06:35 compute-0 ovn_controller[94909]: 2025-10-01T14:06:35Z|00060|binding|INFO|Setting lport c3ad6fe1-af17-4f0b-86c4-8d4384248932 ovn-installed in OVS
Oct 01 14:06:35 compute-0 ovn_controller[94909]: 2025-10-01T14:06:35Z|00061|binding|INFO|Setting lport c3ad6fe1-af17-4f0b-86c4-8d4384248932 up in Southbound
Oct 01 14:06:35 compute-0 nova_compute[192698]: 2025-10-01 14:06:35.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:35 compute-0 nova_compute[192698]: 2025-10-01 14:06:35.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.803 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a5aa2e9f-fc70-481f-ae6c-86210cffcd10]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:06:35 compute-0 systemd-udevd[217175]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:06:35 compute-0 NetworkManager[51741]: <info>  [1759327595.8222] device (tapc3ad6fe1-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:06:35 compute-0 NetworkManager[51741]: <info>  [1759327595.8247] device (tapc3ad6fe1-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:06:35 compute-0 systemd-machined[152704]: New machine qemu-4-instance-00000009.
Oct 01 14:06:35 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000009.
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.856 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c291f6-671e-418d-93ac-5d0645f18979]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:06:35 compute-0 podman[217155]: 2025-10-01 14:06:35.86258249 +0000 UTC m=+0.098758405 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.862 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[814e1841-47c6-43d7-a440-2ccd7f92ea4b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.906 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[c316221c-7aa7-4e1e-92ec-fc48cf39b42a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.930 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e68f31ca-3614-4d76-be4f-7dcbb0321779]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape35f096a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:1b:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382912, 'reachable_time': 20041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217195, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.950 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[01d2cff5-3fcc-4e49-ad99-6a8fe1d65919]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382931, 'tstamp': 382931}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217199, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382936, 'tstamp': 382936}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217199, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.952 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape35f096a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.957 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape35f096a-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.957 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.958 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape35f096a-f0, col_values=(('external_ids', {'iface-id': '3f9111f1-79b1-4bf1-bb95-d924c71fb42c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.958 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:06:35 compute-0 nova_compute[192698]: 2025-10-01 14:06:35.958 2 DEBUG nova.compute.manager [req-e291cad8-8900-4371-bf33-ea106e3e4fc3 req-8d9611cd-339a-4a4a-8c5c-20c3ce8941f2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Received event network-vif-plugged-c3ad6fe1-af17-4f0b-86c4-8d4384248932 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:06:35 compute-0 nova_compute[192698]: 2025-10-01 14:06:35.958 2 DEBUG oslo_concurrency.lockutils [req-e291cad8-8900-4371-bf33-ea106e3e4fc3 req-8d9611cd-339a-4a4a-8c5c-20c3ce8941f2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:06:35 compute-0 nova_compute[192698]: 2025-10-01 14:06:35.959 2 DEBUG oslo_concurrency.lockutils [req-e291cad8-8900-4371-bf33-ea106e3e4fc3 req-8d9611cd-339a-4a4a-8c5c-20c3ce8941f2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:06:35 compute-0 nova_compute[192698]: 2025-10-01 14:06:35.959 2 DEBUG oslo_concurrency.lockutils [req-e291cad8-8900-4371-bf33-ea106e3e4fc3 req-8d9611cd-339a-4a4a-8c5c-20c3ce8941f2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:06:35 compute-0 nova_compute[192698]: 2025-10-01 14:06:35.959 2 DEBUG nova.compute.manager [req-e291cad8-8900-4371-bf33-ea106e3e4fc3 req-8d9611cd-339a-4a4a-8c5c-20c3ce8941f2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Processing event network-vif-plugged-c3ad6fe1-af17-4f0b-86c4-8d4384248932 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:06:35 compute-0 nova_compute[192698]: 2025-10-01 14:06:35.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:06:35.960 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[22b2aa4b-596f-4123-aa16-234902e8d503]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-e35f096a-fd75-4d70-ae58-8a76ae666b9d\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID e35f096a-fd75-4d70-ae58-8a76ae666b9d\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:06:36 compute-0 nova_compute[192698]: 2025-10-01 14:06:36.921 2 DEBUG nova.compute.manager [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:06:36 compute-0 nova_compute[192698]: 2025-10-01 14:06:36.929 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 01 14:06:36 compute-0 nova_compute[192698]: 2025-10-01 14:06:36.935 2 INFO nova.virt.libvirt.driver [-] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Instance spawned successfully.
Oct 01 14:06:36 compute-0 nova_compute[192698]: 2025-10-01 14:06:36.935 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 01 14:06:37 compute-0 nova_compute[192698]: 2025-10-01 14:06:37.454 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:06:37 compute-0 nova_compute[192698]: 2025-10-01 14:06:37.455 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:06:37 compute-0 nova_compute[192698]: 2025-10-01 14:06:37.455 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:06:37 compute-0 nova_compute[192698]: 2025-10-01 14:06:37.456 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:06:37 compute-0 nova_compute[192698]: 2025-10-01 14:06:37.456 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:06:37 compute-0 nova_compute[192698]: 2025-10-01 14:06:37.457 2 DEBUG nova.virt.libvirt.driver [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:06:37 compute-0 nova_compute[192698]: 2025-10-01 14:06:37.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:37 compute-0 nova_compute[192698]: 2025-10-01 14:06:37.972 2 INFO nova.compute.manager [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Took 10.05 seconds to spawn the instance on the hypervisor.
Oct 01 14:06:37 compute-0 nova_compute[192698]: 2025-10-01 14:06:37.972 2 DEBUG nova.compute.manager [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:06:38 compute-0 nova_compute[192698]: 2025-10-01 14:06:38.020 2 DEBUG nova.compute.manager [req-59ca34a3-0cad-4ebd-8aa2-2244df081dd1 req-c225ba09-5077-48cb-88e5-25ed2c6ba875 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Received event network-vif-plugged-c3ad6fe1-af17-4f0b-86c4-8d4384248932 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:06:38 compute-0 nova_compute[192698]: 2025-10-01 14:06:38.021 2 DEBUG oslo_concurrency.lockutils [req-59ca34a3-0cad-4ebd-8aa2-2244df081dd1 req-c225ba09-5077-48cb-88e5-25ed2c6ba875 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:06:38 compute-0 nova_compute[192698]: 2025-10-01 14:06:38.021 2 DEBUG oslo_concurrency.lockutils [req-59ca34a3-0cad-4ebd-8aa2-2244df081dd1 req-c225ba09-5077-48cb-88e5-25ed2c6ba875 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:06:38 compute-0 nova_compute[192698]: 2025-10-01 14:06:38.021 2 DEBUG oslo_concurrency.lockutils [req-59ca34a3-0cad-4ebd-8aa2-2244df081dd1 req-c225ba09-5077-48cb-88e5-25ed2c6ba875 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:06:38 compute-0 nova_compute[192698]: 2025-10-01 14:06:38.021 2 DEBUG nova.compute.manager [req-59ca34a3-0cad-4ebd-8aa2-2244df081dd1 req-c225ba09-5077-48cb-88e5-25ed2c6ba875 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] No waiting events found dispatching network-vif-plugged-c3ad6fe1-af17-4f0b-86c4-8d4384248932 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:06:38 compute-0 nova_compute[192698]: 2025-10-01 14:06:38.022 2 WARNING nova.compute.manager [req-59ca34a3-0cad-4ebd-8aa2-2244df081dd1 req-c225ba09-5077-48cb-88e5-25ed2c6ba875 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Received unexpected event network-vif-plugged-c3ad6fe1-af17-4f0b-86c4-8d4384248932 for instance with vm_state building and task_state spawning.
Oct 01 14:06:38 compute-0 nova_compute[192698]: 2025-10-01 14:06:38.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:38 compute-0 nova_compute[192698]: 2025-10-01 14:06:38.506 2 INFO nova.compute.manager [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Took 15.36 seconds to build instance.
Oct 01 14:06:39 compute-0 nova_compute[192698]: 2025-10-01 14:06:39.013 2 DEBUG oslo_concurrency.lockutils [None req-70099f5d-fe5e-4dd0-910b-968d8b7cb037 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.883s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:06:42 compute-0 nova_compute[192698]: 2025-10-01 14:06:42.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:43 compute-0 nova_compute[192698]: 2025-10-01 14:06:43.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:45 compute-0 podman[217208]: 2025-10-01 14:06:45.21487999 +0000 UTC m=+0.110326033 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct 01 14:06:45 compute-0 podman[217209]: 2025-10-01 14:06:45.336450906 +0000 UTC m=+0.225233551 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 01 14:06:47 compute-0 nova_compute[192698]: 2025-10-01 14:06:47.334 2 DEBUG nova.compute.manager [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6169
Oct 01 14:06:47 compute-0 nova_compute[192698]: 2025-10-01 14:06:47.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:47 compute-0 nova_compute[192698]: 2025-10-01 14:06:47.871 2 DEBUG oslo_concurrency.lockutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:06:47 compute-0 nova_compute[192698]: 2025-10-01 14:06:47.872 2 DEBUG oslo_concurrency.lockutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:06:48 compute-0 nova_compute[192698]: 2025-10-01 14:06:48.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:48 compute-0 nova_compute[192698]: 2025-10-01 14:06:48.385 2 DEBUG nova.objects.instance [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'pci_requests' on Instance uuid cc19e5cf-bf34-4a91-a2d7-519421be8b85 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:06:48 compute-0 ovn_controller[94909]: 2025-10-01T14:06:48Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:15:67:38 10.100.0.6
Oct 01 14:06:48 compute-0 ovn_controller[94909]: 2025-10-01T14:06:48Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:67:38 10.100.0.6
Oct 01 14:06:48 compute-0 nova_compute[192698]: 2025-10-01 14:06:48.663 2 DEBUG nova.virt.libvirt.driver [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Creating tmpfile /var/lib/nova/instances/tmpkspouht5 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 01 14:06:48 compute-0 nova_compute[192698]: 2025-10-01 14:06:48.665 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:06:48 compute-0 nova_compute[192698]: 2025-10-01 14:06:48.768 2 DEBUG nova.compute.manager [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkspouht5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 01 14:06:48 compute-0 nova_compute[192698]: 2025-10-01 14:06:48.787 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:06:48 compute-0 nova_compute[192698]: 2025-10-01 14:06:48.788 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:06:48 compute-0 nova_compute[192698]: 2025-10-01 14:06:48.900 2 DEBUG nova.virt.hardware [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:06:48 compute-0 nova_compute[192698]: 2025-10-01 14:06:48.900 2 INFO nova.compute.claims [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:06:48 compute-0 nova_compute[192698]: 2025-10-01 14:06:48.901 2 DEBUG nova.objects.instance [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'resources' on Instance uuid cc19e5cf-bf34-4a91-a2d7-519421be8b85 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:06:49 compute-0 nova_compute[192698]: 2025-10-01 14:06:49.297 2 INFO nova.compute.rpcapi [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Automatically selected compute RPC version 6.4 from minimum service version 70
Oct 01 14:06:49 compute-0 nova_compute[192698]: 2025-10-01 14:06:49.298 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:06:49 compute-0 nova_compute[192698]: 2025-10-01 14:06:49.409 2 DEBUG nova.objects.base [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<cc19e5cf-bf34-4a91-a2d7-519421be8b85> lazy-loaded attributes: pci_requests,resources wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:06:49 compute-0 nova_compute[192698]: 2025-10-01 14:06:49.410 2 DEBUG nova.objects.instance [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'numa_topology' on Instance uuid cc19e5cf-bf34-4a91-a2d7-519421be8b85 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:06:49 compute-0 nova_compute[192698]: 2025-10-01 14:06:49.918 2 DEBUG nova.objects.base [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<cc19e5cf-bf34-4a91-a2d7-519421be8b85> lazy-loaded attributes: pci_requests,resources,numa_topology wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:06:49 compute-0 nova_compute[192698]: 2025-10-01 14:06:49.919 2 DEBUG nova.objects.instance [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'pci_devices' on Instance uuid cc19e5cf-bf34-4a91-a2d7-519421be8b85 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:06:50 compute-0 nova_compute[192698]: 2025-10-01 14:06:50.426 2 DEBUG nova.objects.base [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<cc19e5cf-bf34-4a91-a2d7-519421be8b85> lazy-loaded attributes: pci_requests,resources,numa_topology,pci_devices wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:06:50 compute-0 nova_compute[192698]: 2025-10-01 14:06:50.939 2 INFO nova.compute.resource_tracker [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Updating resource usage from migration 8b717a78-b855-476a-815f-00c4f6e57d6f
Oct 01 14:06:50 compute-0 nova_compute[192698]: 2025-10-01 14:06:50.940 2 DEBUG nova.compute.resource_tracker [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Starting to track incoming migration 8b717a78-b855-476a-815f-00c4f6e57d6f with flavor 69702c4b-38f2-49d1-96d5-85671652c67e _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 01 14:06:51 compute-0 nova_compute[192698]: 2025-10-01 14:06:51.313 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:06:51 compute-0 nova_compute[192698]: 2025-10-01 14:06:51.582 2 DEBUG nova.compute.provider_tree [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:06:52 compute-0 nova_compute[192698]: 2025-10-01 14:06:52.092 2 DEBUG nova.scheduler.client.report [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:06:52 compute-0 nova_compute[192698]: 2025-10-01 14:06:52.606 2 DEBUG oslo_concurrency.lockutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 4.734s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:06:52 compute-0 nova_compute[192698]: 2025-10-01 14:06:52.607 2 INFO nova.compute.manager [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Migrating
Oct 01 14:06:52 compute-0 nova_compute[192698]: 2025-10-01 14:06:52.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:53 compute-0 podman[217269]: 2025-10-01 14:06:53.200891003 +0000 UTC m=+0.101602490 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Oct 01 14:06:53 compute-0 nova_compute[192698]: 2025-10-01 14:06:53.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:55 compute-0 nova_compute[192698]: 2025-10-01 14:06:55.286 2 DEBUG nova.compute.manager [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkspouht5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c5470fba-81f4-4592-8b40-1027a4dc1c83',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 01 14:06:56 compute-0 nova_compute[192698]: 2025-10-01 14:06:56.303 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-c5470fba-81f4-4592-8b40-1027a4dc1c83" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:06:56 compute-0 nova_compute[192698]: 2025-10-01 14:06:56.304 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-c5470fba-81f4-4592-8b40-1027a4dc1c83" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:06:56 compute-0 nova_compute[192698]: 2025-10-01 14:06:56.304 2 DEBUG nova.network.neutron [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:06:56 compute-0 nova_compute[192698]: 2025-10-01 14:06:56.810 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:06:57 compute-0 sshd-session[217291]: Accepted publickey for nova from 192.168.122.101 port 49470 ssh2: ECDSA SHA256:CTpwsitmS+rJgEXvcdw8I+MI7CMoaMbIOq3Cw9WFkYA
Oct 01 14:06:57 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Oct 01 14:06:57 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Oct 01 14:06:57 compute-0 systemd-logind[791]: New session 29 of user nova.
Oct 01 14:06:57 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Oct 01 14:06:57 compute-0 systemd[1]: Starting User Manager for UID 42436...
Oct 01 14:06:57 compute-0 systemd[217295]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 01 14:06:57 compute-0 systemd[217295]: Queued start job for default target Main User Target.
Oct 01 14:06:57 compute-0 systemd[217295]: Created slice User Application Slice.
Oct 01 14:06:57 compute-0 systemd[217295]: Started Mark boot as successful after the user session has run 2 minutes.
Oct 01 14:06:57 compute-0 systemd[217295]: Started Daily Cleanup of User's Temporary Directories.
Oct 01 14:06:57 compute-0 systemd[217295]: Reached target Paths.
Oct 01 14:06:57 compute-0 systemd[217295]: Reached target Timers.
Oct 01 14:06:57 compute-0 systemd[217295]: Starting D-Bus User Message Bus Socket...
Oct 01 14:06:57 compute-0 systemd[217295]: Starting Create User's Volatile Files and Directories...
Oct 01 14:06:57 compute-0 systemd[217295]: Finished Create User's Volatile Files and Directories.
Oct 01 14:06:57 compute-0 systemd[217295]: Listening on D-Bus User Message Bus Socket.
Oct 01 14:06:57 compute-0 systemd[217295]: Reached target Sockets.
Oct 01 14:06:57 compute-0 systemd[217295]: Reached target Basic System.
Oct 01 14:06:57 compute-0 systemd[217295]: Reached target Main User Target.
Oct 01 14:06:57 compute-0 systemd[217295]: Startup finished in 160ms.
Oct 01 14:06:57 compute-0 systemd[1]: Started User Manager for UID 42436.
Oct 01 14:06:57 compute-0 systemd[1]: Started Session 29 of User nova.
Oct 01 14:06:57 compute-0 sshd-session[217291]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 01 14:06:57 compute-0 sshd-session[217311]: Received disconnect from 192.168.122.101 port 49470:11: disconnected by user
Oct 01 14:06:57 compute-0 sshd-session[217311]: Disconnected from user nova 192.168.122.101 port 49470
Oct 01 14:06:57 compute-0 sshd-session[217291]: pam_unix(sshd:session): session closed for user nova
Oct 01 14:06:57 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Oct 01 14:06:57 compute-0 systemd-logind[791]: Session 29 logged out. Waiting for processes to exit.
Oct 01 14:06:57 compute-0 systemd-logind[791]: Removed session 29.
Oct 01 14:06:57 compute-0 nova_compute[192698]: 2025-10-01 14:06:57.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:57 compute-0 sshd-session[217313]: Accepted publickey for nova from 192.168.122.101 port 49486 ssh2: ECDSA SHA256:CTpwsitmS+rJgEXvcdw8I+MI7CMoaMbIOq3Cw9WFkYA
Oct 01 14:06:57 compute-0 systemd-logind[791]: New session 31 of user nova.
Oct 01 14:06:57 compute-0 systemd[1]: Started Session 31 of User nova.
Oct 01 14:06:57 compute-0 sshd-session[217313]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 01 14:06:57 compute-0 sshd-session[217316]: Received disconnect from 192.168.122.101 port 49486:11: disconnected by user
Oct 01 14:06:57 compute-0 sshd-session[217316]: Disconnected from user nova 192.168.122.101 port 49486
Oct 01 14:06:57 compute-0 sshd-session[217313]: pam_unix(sshd:session): session closed for user nova
Oct 01 14:06:57 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Oct 01 14:06:57 compute-0 systemd-logind[791]: Session 31 logged out. Waiting for processes to exit.
Oct 01 14:06:57 compute-0 systemd-logind[791]: Removed session 31.
Oct 01 14:06:58 compute-0 nova_compute[192698]: 2025-10-01 14:06:58.129 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:06:58 compute-0 nova_compute[192698]: 2025-10-01 14:06:58.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:06:59 compute-0 nova_compute[192698]: 2025-10-01 14:06:59.024 2 DEBUG nova.network.neutron [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Updating instance_info_cache with network_info: [{"id": "bae9bb47-22fa-49ee-9b7e-fc3a13b33880", "address": "fa:16:3e:1d:07:29", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae9bb47-22", "ovs_interfaceid": "bae9bb47-22fa-49ee-9b7e-fc3a13b33880", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:06:59 compute-0 podman[217319]: 2025-10-01 14:06:59.210081821 +0000 UTC m=+0.102529195 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 01 14:06:59 compute-0 podman[217318]: 2025-10-01 14:06:59.211115148 +0000 UTC m=+0.106179612 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:06:59 compute-0 nova_compute[192698]: 2025-10-01 14:06:59.533 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-c5470fba-81f4-4592-8b40-1027a4dc1c83" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:06:59 compute-0 nova_compute[192698]: 2025-10-01 14:06:59.546 2 DEBUG nova.virt.libvirt.driver [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkspouht5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c5470fba-81f4-4592-8b40-1027a4dc1c83',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 01 14:06:59 compute-0 nova_compute[192698]: 2025-10-01 14:06:59.546 2 DEBUG nova.virt.libvirt.driver [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Creating instance directory: /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 01 14:06:59 compute-0 nova_compute[192698]: 2025-10-01 14:06:59.547 2 DEBUG nova.virt.libvirt.driver [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Creating disk.info with the contents: {'/var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk': 'qcow2', '/var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 01 14:06:59 compute-0 nova_compute[192698]: 2025-10-01 14:06:59.547 2 DEBUG nova.virt.libvirt.driver [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 01 14:06:59 compute-0 nova_compute[192698]: 2025-10-01 14:06:59.548 2 DEBUG nova.objects.instance [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid c5470fba-81f4-4592-8b40-1027a4dc1c83 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:06:59 compute-0 podman[203144]: time="2025-10-01T14:06:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:06:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:06:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:06:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:06:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3458 "" "Go-http-client/1.1"
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.056 2 DEBUG oslo_utils.imageutils.format_inspector [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.061 2 DEBUG oslo_utils.imageutils.format_inspector [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.063 2 DEBUG oslo_concurrency.processutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.167 2 DEBUG oslo_concurrency.processutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.169 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.170 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.172 2 DEBUG oslo_utils.imageutils.format_inspector [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.181 2 DEBUG oslo_utils.imageutils.format_inspector [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.182 2 DEBUG oslo_concurrency.processutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.264 2 DEBUG oslo_concurrency.processutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.266 2 DEBUG oslo_concurrency.processutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.318 2 DEBUG oslo_concurrency.processutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.320 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.149s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.322 2 DEBUG oslo_concurrency.processutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.347 2 DEBUG nova.compute.manager [req-f78a458a-f304-46f0-aa62-eebffe933bb3 req-b813c9be-1e6c-424b-8653-d56dea5c1e69 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received event network-vif-unplugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.349 2 DEBUG oslo_concurrency.lockutils [req-f78a458a-f304-46f0-aa62-eebffe933bb3 req-b813c9be-1e6c-424b-8653-d56dea5c1e69 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.350 2 DEBUG oslo_concurrency.lockutils [req-f78a458a-f304-46f0-aa62-eebffe933bb3 req-b813c9be-1e6c-424b-8653-d56dea5c1e69 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.350 2 DEBUG oslo_concurrency.lockutils [req-f78a458a-f304-46f0-aa62-eebffe933bb3 req-b813c9be-1e6c-424b-8653-d56dea5c1e69 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.351 2 DEBUG nova.compute.manager [req-f78a458a-f304-46f0-aa62-eebffe933bb3 req-b813c9be-1e6c-424b-8653-d56dea5c1e69 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] No waiting events found dispatching network-vif-unplugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.351 2 WARNING nova.compute.manager [req-f78a458a-f304-46f0-aa62-eebffe933bb3 req-b813c9be-1e6c-424b-8653-d56dea5c1e69 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received unexpected event network-vif-unplugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be for instance with vm_state active and task_state resize_migrating.
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.397 2 DEBUG oslo_concurrency.processutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.399 2 DEBUG nova.virt.disk.api [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.399 2 DEBUG oslo_concurrency.processutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.463 2 DEBUG oslo_concurrency.processutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.465 2 DEBUG nova.virt.disk.api [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.466 2 DEBUG nova.objects.instance [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid c5470fba-81f4-4592-8b40-1027a4dc1c83 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.982 2 DEBUG nova.objects.base [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<c5470fba-81f4-4592-8b40-1027a4dc1c83> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:07:00 compute-0 nova_compute[192698]: 2025-10-01 14:07:00.984 2 DEBUG oslo_concurrency.processutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.028 2 DEBUG oslo_concurrency.processutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk.config 497664" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.030 2 DEBUG nova.virt.libvirt.driver [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.032 2 DEBUG nova.virt.libvirt.vif [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-01T14:05:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1628220963',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1628220963',id=6,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:05:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-dq548l48',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteActionsViaActuator-2075848047-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:05:31Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=c5470fba-81f4-4592-8b40-1027a4dc1c83,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bae9bb47-22fa-49ee-9b7e-fc3a13b33880", "address": "fa:16:3e:1d:07:29", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbae9bb47-22", "ovs_interfaceid": "bae9bb47-22fa-49ee-9b7e-fc3a13b33880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.032 2 DEBUG nova.network.os_vif_util [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "bae9bb47-22fa-49ee-9b7e-fc3a13b33880", "address": "fa:16:3e:1d:07:29", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbae9bb47-22", "ovs_interfaceid": "bae9bb47-22fa-49ee-9b7e-fc3a13b33880", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.034 2 DEBUG nova.network.os_vif_util [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:07:29,bridge_name='br-int',has_traffic_filtering=True,id=bae9bb47-22fa-49ee-9b7e-fc3a13b33880,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae9bb47-22') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.035 2 DEBUG os_vif [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:07:29,bridge_name='br-int',has_traffic_filtering=True,id=bae9bb47-22fa-49ee-9b7e-fc3a13b33880,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae9bb47-22') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.036 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.037 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.038 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '3b2effd2-e2ba-5ad3-ac8d-e6c964291de4', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.049 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbae9bb47-22, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.050 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapbae9bb47-22, col_values=(('qos', UUID('477eab30-b842-48e3-b3a3-d66fb7d45b53')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.050 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapbae9bb47-22, col_values=(('external_ids', {'iface-id': 'bae9bb47-22fa-49ee-9b7e-fc3a13b33880', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1d:07:29', 'vm-uuid': 'c5470fba-81f4-4592-8b40-1027a4dc1c83'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:01 compute-0 NetworkManager[51741]: <info>  [1759327621.0535] manager: (tapbae9bb47-22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.066 2 INFO os_vif [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:07:29,bridge_name='br-int',has_traffic_filtering=True,id=bae9bb47-22fa-49ee-9b7e-fc3a13b33880,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae9bb47-22')
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.068 2 DEBUG nova.virt.libvirt.driver [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.068 2 DEBUG nova.compute.manager [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkspouht5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c5470fba-81f4-4592-8b40-1027a4dc1c83',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 01 14:07:01 compute-0 nova_compute[192698]: 2025-10-01 14:07:01.070 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:01 compute-0 sshd-session[217381]: Accepted publickey for nova from 192.168.122.101 port 49498 ssh2: ECDSA SHA256:CTpwsitmS+rJgEXvcdw8I+MI7CMoaMbIOq3Cw9WFkYA
Oct 01 14:07:01 compute-0 systemd-logind[791]: New session 32 of user nova.
Oct 01 14:07:01 compute-0 systemd[1]: Started Session 32 of User nova.
Oct 01 14:07:01 compute-0 sshd-session[217381]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 01 14:07:01 compute-0 openstack_network_exporter[205307]: ERROR   14:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:07:01 compute-0 openstack_network_exporter[205307]: ERROR   14:07:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:07:01 compute-0 openstack_network_exporter[205307]: ERROR   14:07:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:07:01 compute-0 openstack_network_exporter[205307]: ERROR   14:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:07:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:07:01 compute-0 openstack_network_exporter[205307]: ERROR   14:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:07:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:07:01 compute-0 sshd-session[217384]: Received disconnect from 192.168.122.101 port 49498:11: disconnected by user
Oct 01 14:07:01 compute-0 sshd-session[217384]: Disconnected from user nova 192.168.122.101 port 49498
Oct 01 14:07:01 compute-0 sshd-session[217381]: pam_unix(sshd:session): session closed for user nova
Oct 01 14:07:01 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Oct 01 14:07:01 compute-0 systemd-logind[791]: Session 32 logged out. Waiting for processes to exit.
Oct 01 14:07:01 compute-0 systemd-logind[791]: Removed session 32.
Oct 01 14:07:01 compute-0 sshd-session[217386]: Accepted publickey for nova from 192.168.122.101 port 49508 ssh2: ECDSA SHA256:CTpwsitmS+rJgEXvcdw8I+MI7CMoaMbIOq3Cw9WFkYA
Oct 01 14:07:01 compute-0 systemd-logind[791]: New session 33 of user nova.
Oct 01 14:07:02 compute-0 systemd[1]: Started Session 33 of User nova.
Oct 01 14:07:02 compute-0 sshd-session[217386]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 01 14:07:02 compute-0 nova_compute[192698]: 2025-10-01 14:07:02.018 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:02 compute-0 sshd-session[217389]: Received disconnect from 192.168.122.101 port 49508:11: disconnected by user
Oct 01 14:07:02 compute-0 sshd-session[217389]: Disconnected from user nova 192.168.122.101 port 49508
Oct 01 14:07:02 compute-0 sshd-session[217386]: pam_unix(sshd:session): session closed for user nova
Oct 01 14:07:02 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Oct 01 14:07:02 compute-0 systemd-logind[791]: Session 33 logged out. Waiting for processes to exit.
Oct 01 14:07:02 compute-0 systemd-logind[791]: Removed session 33.
Oct 01 14:07:02 compute-0 sshd-session[217391]: Accepted publickey for nova from 192.168.122.101 port 49522 ssh2: ECDSA SHA256:CTpwsitmS+rJgEXvcdw8I+MI7CMoaMbIOq3Cw9WFkYA
Oct 01 14:07:02 compute-0 systemd-logind[791]: New session 34 of user nova.
Oct 01 14:07:02 compute-0 systemd[1]: Started Session 34 of User nova.
Oct 01 14:07:02 compute-0 sshd-session[217391]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Oct 01 14:07:02 compute-0 sshd-session[217394]: Received disconnect from 192.168.122.101 port 49522:11: disconnected by user
Oct 01 14:07:02 compute-0 sshd-session[217394]: Disconnected from user nova 192.168.122.101 port 49522
Oct 01 14:07:02 compute-0 sshd-session[217391]: pam_unix(sshd:session): session closed for user nova
Oct 01 14:07:02 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Oct 01 14:07:02 compute-0 systemd-logind[791]: Session 34 logged out. Waiting for processes to exit.
Oct 01 14:07:02 compute-0 systemd-logind[791]: Removed session 34.
Oct 01 14:07:02 compute-0 nova_compute[192698]: 2025-10-01 14:07:02.414 2 DEBUG nova.compute.manager [req-1dfb01d8-1aa7-4736-9396-b9a4cdc46ae5 req-d3e9896f-7227-44c8-93b3-9a012dd89ab5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received event network-vif-unplugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:02 compute-0 nova_compute[192698]: 2025-10-01 14:07:02.415 2 DEBUG oslo_concurrency.lockutils [req-1dfb01d8-1aa7-4736-9396-b9a4cdc46ae5 req-d3e9896f-7227-44c8-93b3-9a012dd89ab5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:02 compute-0 nova_compute[192698]: 2025-10-01 14:07:02.415 2 DEBUG oslo_concurrency.lockutils [req-1dfb01d8-1aa7-4736-9396-b9a4cdc46ae5 req-d3e9896f-7227-44c8-93b3-9a012dd89ab5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:02 compute-0 nova_compute[192698]: 2025-10-01 14:07:02.415 2 DEBUG oslo_concurrency.lockutils [req-1dfb01d8-1aa7-4736-9396-b9a4cdc46ae5 req-d3e9896f-7227-44c8-93b3-9a012dd89ab5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:02 compute-0 nova_compute[192698]: 2025-10-01 14:07:02.416 2 DEBUG nova.compute.manager [req-1dfb01d8-1aa7-4736-9396-b9a4cdc46ae5 req-d3e9896f-7227-44c8-93b3-9a012dd89ab5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] No waiting events found dispatching network-vif-unplugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:07:02 compute-0 nova_compute[192698]: 2025-10-01 14:07:02.416 2 WARNING nova.compute.manager [req-1dfb01d8-1aa7-4736-9396-b9a4cdc46ae5 req-d3e9896f-7227-44c8-93b3-9a012dd89ab5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received unexpected event network-vif-unplugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be for instance with vm_state active and task_state resize_migrating.
Oct 01 14:07:02 compute-0 nova_compute[192698]: 2025-10-01 14:07:02.523 2 DEBUG nova.network.neutron [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Port bae9bb47-22fa-49ee-9b7e-fc3a13b33880 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 01 14:07:02 compute-0 nova_compute[192698]: 2025-10-01 14:07:02.538 2 DEBUG nova.compute.manager [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkspouht5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c5470fba-81f4-4592-8b40-1027a4dc1c83',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 01 14:07:02 compute-0 nova_compute[192698]: 2025-10-01 14:07:02.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:04 compute-0 nova_compute[192698]: 2025-10-01 14:07:04.504 2 WARNING neutronclient.v2_0.client [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:05 compute-0 nova_compute[192698]: 2025-10-01 14:07:05.026 2 INFO nova.network.neutron [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Updating port 9eb8c749-13c4-43a1-8edd-b95f31dba7be with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Oct 01 14:07:05 compute-0 nova_compute[192698]: 2025-10-01 14:07:05.518 2 DEBUG oslo_concurrency.lockutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-cc19e5cf-bf34-4a91-a2d7-519421be8b85" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:07:05 compute-0 nova_compute[192698]: 2025-10-01 14:07:05.518 2 DEBUG oslo_concurrency.lockutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-cc19e5cf-bf34-4a91-a2d7-519421be8b85" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:07:05 compute-0 nova_compute[192698]: 2025-10-01 14:07:05.518 2 DEBUG nova.network.neutron [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:07:05 compute-0 nova_compute[192698]: 2025-10-01 14:07:05.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:07:06 compute-0 nova_compute[192698]: 2025-10-01 14:07:06.029 2 WARNING neutronclient.v2_0.client [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:06 compute-0 nova_compute[192698]: 2025-10-01 14:07:06.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:06 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 01 14:07:06 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 01 14:07:06 compute-0 podman[217397]: 2025-10-01 14:07:06.198426195 +0000 UTC m=+0.097396206 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 14:07:06 compute-0 kernel: tapbae9bb47-22: entered promiscuous mode
Oct 01 14:07:06 compute-0 NetworkManager[51741]: <info>  [1759327626.3109] manager: (tapbae9bb47-22): new Tun device (/org/freedesktop/NetworkManager/Devices/35)
Oct 01 14:07:06 compute-0 nova_compute[192698]: 2025-10-01 14:07:06.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:06 compute-0 ovn_controller[94909]: 2025-10-01T14:07:06Z|00062|binding|INFO|Claiming lport bae9bb47-22fa-49ee-9b7e-fc3a13b33880 for this additional chassis.
Oct 01 14:07:06 compute-0 ovn_controller[94909]: 2025-10-01T14:07:06Z|00063|binding|INFO|bae9bb47-22fa-49ee-9b7e-fc3a13b33880: Claiming fa:16:3e:1d:07:29 10.100.0.10
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.342 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:07:29 10.100.0.10'], port_security=['fa:16:3e:1d:07:29 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c5470fba-81f4-4592-8b40-1027a4dc1c83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67079b4774294271895bbf7b04f602e7', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'de7872a1-1f76-4b0f-8bd9-119520ff7a88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e3a455d-1f77-441e-b08a-0ec8231910e5, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[], logical_port=bae9bb47-22fa-49ee-9b7e-fc3a13b33880) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.344 103791 INFO neutron.agent.ovn.metadata.agent [-] Port bae9bb47-22fa-49ee-9b7e-fc3a13b33880 in datapath e35f096a-fd75-4d70-ae58-8a76ae666b9d unbound from our chassis
Oct 01 14:07:06 compute-0 nova_compute[192698]: 2025-10-01 14:07:06.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.347 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e35f096a-fd75-4d70-ae58-8a76ae666b9d
Oct 01 14:07:06 compute-0 nova_compute[192698]: 2025-10-01 14:07:06.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:06 compute-0 ovn_controller[94909]: 2025-10-01T14:07:06Z|00064|binding|INFO|Setting lport bae9bb47-22fa-49ee-9b7e-fc3a13b33880 ovn-installed in OVS
Oct 01 14:07:06 compute-0 systemd-udevd[217452]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.373 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[84f7fe5d-9b1e-4774-a8e0-8a936829115b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:06 compute-0 systemd-machined[152704]: New machine qemu-5-instance-00000006.
Oct 01 14:07:06 compute-0 NetworkManager[51741]: <info>  [1759327626.3880] device (tapbae9bb47-22): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:07:06 compute-0 NetworkManager[51741]: <info>  [1759327626.3901] device (tapbae9bb47-22): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:07:06 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000006.
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.428 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[50aed859-a63f-48f2-ab24-8f0ab2b0ffa1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.432 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[e336b879-9ad0-4ca1-af9c-98781055d8c8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.481 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[fec3b145-4a37-4466-8b55-f793a871224f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.507 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[6777571f-2cdf-4f6b-8567-360bbde8a241]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape35f096a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:1b:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 10, 'rx_bytes': 1084, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 10, 'rx_bytes': 1084, 'tx_bytes': 612, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382912, 'reachable_time': 20041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217466, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.535 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[12b1e6c3-c967-4d44-9c42-98206b70eb90]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382931, 'tstamp': 382931}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217468, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382936, 'tstamp': 382936}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217468, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.536 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape35f096a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.582 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape35f096a-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.583 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.583 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape35f096a-f0, col_values=(('external_ids', {'iface-id': '3f9111f1-79b1-4bf1-bb95-d924c71fb42c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.583 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:06 compute-0 nova_compute[192698]: 2025-10-01 14:07:06.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:06.585 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[5feeaa40-9329-4cad-a697-598466932baa]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-e35f096a-fd75-4d70-ae58-8a76ae666b9d\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID e35f096a-fd75-4d70-ae58-8a76ae666b9d\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:06 compute-0 nova_compute[192698]: 2025-10-01 14:07:06.638 2 DEBUG nova.compute.manager [req-b8eb9041-c914-49fc-98d9-b7dcb39320ff req-8e38a645-f12e-4264-82aa-448fa06e290e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received event network-changed-9eb8c749-13c4-43a1-8edd-b95f31dba7be external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:06 compute-0 nova_compute[192698]: 2025-10-01 14:07:06.639 2 DEBUG nova.compute.manager [req-b8eb9041-c914-49fc-98d9-b7dcb39320ff req-8e38a645-f12e-4264-82aa-448fa06e290e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Refreshing instance network info cache due to event network-changed-9eb8c749-13c4-43a1-8edd-b95f31dba7be. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:07:06 compute-0 nova_compute[192698]: 2025-10-01 14:07:06.639 2 DEBUG oslo_concurrency.lockutils [req-b8eb9041-c914-49fc-98d9-b7dcb39320ff req-8e38a645-f12e-4264-82aa-448fa06e290e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-cc19e5cf-bf34-4a91-a2d7-519421be8b85" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:07:07 compute-0 nova_compute[192698]: 2025-10-01 14:07:07.002 2 WARNING neutronclient.v2_0.client [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:07 compute-0 nova_compute[192698]: 2025-10-01 14:07:07.253 2 DEBUG nova.network.neutron [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Updating instance_info_cache with network_info: [{"id": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "address": "fa:16:3e:ae:eb:01", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb8c749-13", "ovs_interfaceid": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:07:07 compute-0 nova_compute[192698]: 2025-10-01 14:07:07.763 2 DEBUG oslo_concurrency.lockutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-cc19e5cf-bf34-4a91-a2d7-519421be8b85" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:07:07 compute-0 nova_compute[192698]: 2025-10-01 14:07:07.774 2 DEBUG oslo_concurrency.lockutils [req-b8eb9041-c914-49fc-98d9-b7dcb39320ff req-8e38a645-f12e-4264-82aa-448fa06e290e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-cc19e5cf-bf34-4a91-a2d7-519421be8b85" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:07:07 compute-0 nova_compute[192698]: 2025-10-01 14:07:07.774 2 DEBUG nova.network.neutron [req-b8eb9041-c914-49fc-98d9-b7dcb39320ff req-8e38a645-f12e-4264-82aa-448fa06e290e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Refreshing network info cache for port 9eb8c749-13c4-43a1-8edd-b95f31dba7be _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:07:07 compute-0 nova_compute[192698]: 2025-10-01 14:07:07.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:08 compute-0 nova_compute[192698]: 2025-10-01 14:07:08.379 2 WARNING neutronclient.v2_0.client [req-b8eb9041-c914-49fc-98d9-b7dcb39320ff req-8e38a645-f12e-4264-82aa-448fa06e290e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:08 compute-0 nova_compute[192698]: 2025-10-01 14:07:08.396 2 DEBUG nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Oct 01 14:07:08 compute-0 nova_compute[192698]: 2025-10-01 14:07:08.399 2 DEBUG nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Oct 01 14:07:08 compute-0 nova_compute[192698]: 2025-10-01 14:07:08.400 2 INFO nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Creating image(s)
Oct 01 14:07:08 compute-0 nova_compute[192698]: 2025-10-01 14:07:08.401 2 DEBUG nova.objects.instance [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid cc19e5cf-bf34-4a91-a2d7-519421be8b85 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:07:08 compute-0 nova_compute[192698]: 2025-10-01 14:07:08.913 2 DEBUG oslo_concurrency.processutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:08 compute-0 nova_compute[192698]: 2025-10-01 14:07:08.929 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:07:08 compute-0 nova_compute[192698]: 2025-10-01 14:07:08.935 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.003 2 DEBUG oslo_concurrency.processutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.004 2 DEBUG nova.virt.disk.api [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.005 2 DEBUG oslo_concurrency.processutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.092 2 DEBUG oslo_concurrency.processutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.094 2 DEBUG nova.virt.disk.api [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.449 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.450 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.450 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.450 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.452 2 WARNING neutronclient.v2_0.client [req-b8eb9041-c914-49fc-98d9-b7dcb39320ff req-8e38a645-f12e-4264-82aa-448fa06e290e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:09 compute-0 ovn_controller[94909]: 2025-10-01T14:07:09Z|00065|binding|INFO|Claiming lport bae9bb47-22fa-49ee-9b7e-fc3a13b33880 for this chassis.
Oct 01 14:07:09 compute-0 ovn_controller[94909]: 2025-10-01T14:07:09Z|00066|binding|INFO|bae9bb47-22fa-49ee-9b7e-fc3a13b33880: Claiming fa:16:3e:1d:07:29 10.100.0.10
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.604 2 DEBUG nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.605 2 DEBUG nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Ensure instance console log exists: /var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:07:09 compute-0 ovn_controller[94909]: 2025-10-01T14:07:09Z|00067|binding|INFO|Setting lport bae9bb47-22fa-49ee-9b7e-fc3a13b33880 up in Southbound
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.606 2 DEBUG oslo_concurrency.lockutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.606 2 DEBUG oslo_concurrency.lockutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.607 2 DEBUG oslo_concurrency.lockutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.611 2 DEBUG nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Start _get_guest_xml network_info=[{"id": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "address": "fa:16:3e:ae:eb:01", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "vif_mac": "fa:16:3e:ae:eb:01"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb8c749-13", "ovs_interfaceid": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.619 2 WARNING nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.622 2 DEBUG nova.virt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-290041760', uuid='cc19e5cf-bf34-4a91-a2d7-519421be8b85'), owner=OwnerMeta(userid='82619989ef1f48a39f1c1e7d64e4cb38', username='tempest-TestExecuteActionsViaActuator-2075848047-project-admin', projectid='67079b4774294271895bbf7b04f602e7', projectname='tempest-TestExecuteActionsViaActuator-2075848047'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "address": "fa:16:3e:ae:eb:01", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "vif_mac": "fa:16:3e:ae:eb:01"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb8c749-13", "ovs_interfaceid": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759327629.6225417) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.628 2 DEBUG nova.virt.libvirt.host [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.630 2 DEBUG nova.virt.libvirt.host [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.635 2 DEBUG nova.virt.libvirt.host [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.635 2 DEBUG nova.virt.libvirt.host [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.637 2 DEBUG nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.637 2 DEBUG nova.virt.hardware [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.639 2 DEBUG nova.virt.hardware [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.639 2 DEBUG nova.virt.hardware [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.639 2 DEBUG nova.virt.hardware [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.640 2 DEBUG nova.virt.hardware [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.640 2 DEBUG nova.virt.hardware [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.641 2 DEBUG nova.virt.hardware [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.641 2 DEBUG nova.virt.hardware [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.642 2 DEBUG nova.virt.hardware [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.642 2 DEBUG nova.virt.hardware [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.643 2 DEBUG nova.virt.hardware [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.643 2 DEBUG nova.objects.instance [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid cc19e5cf-bf34-4a91-a2d7-519421be8b85 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.695 2 DEBUG nova.network.neutron [req-b8eb9041-c914-49fc-98d9-b7dcb39320ff req-8e38a645-f12e-4264-82aa-448fa06e290e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Updated VIF entry in instance network info cache for port 9eb8c749-13c4-43a1-8edd-b95f31dba7be. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Oct 01 14:07:09 compute-0 nova_compute[192698]: 2025-10-01 14:07:09.696 2 DEBUG nova.network.neutron [req-b8eb9041-c914-49fc-98d9-b7dcb39320ff req-8e38a645-f12e-4264-82aa-448fa06e290e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Updating instance_info_cache with network_info: [{"id": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "address": "fa:16:3e:ae:eb:01", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb8c749-13", "ovs_interfaceid": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.154 2 DEBUG nova.objects.base [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<cc19e5cf-bf34-4a91-a2d7-519421be8b85> lazy-loaded attributes: trusted_certs,vcpu_model wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.160 2 DEBUG oslo_concurrency.processutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.202 2 DEBUG oslo_concurrency.lockutils [req-b8eb9041-c914-49fc-98d9-b7dcb39320ff req-8e38a645-f12e-4264-82aa-448fa06e290e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-cc19e5cf-bf34-4a91-a2d7-519421be8b85" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.253 2 DEBUG oslo_concurrency.processutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85/disk.config --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.254 2 DEBUG oslo_concurrency.lockutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "/var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.255 2 DEBUG oslo_concurrency.lockutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "/var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.257 2 DEBUG oslo_concurrency.lockutils [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "/var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.260 2 DEBUG nova.virt.libvirt.vif [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:06:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-290041760',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-290041760',id=8,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:06:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-nzqp9uwt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model
='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteActionsViaActuator-2075848047-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:07:03Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=cc19e5cf-bf34-4a91-a2d7-519421be8b85,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "address": "fa:16:3e:ae:eb:01", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "vif_mac": "fa:16:3e:ae:eb:01"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb8c749-13", "ovs_interfaceid": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.261 2 DEBUG nova.network.os_vif_util [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "address": "fa:16:3e:ae:eb:01", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "vif_mac": "fa:16:3e:ae:eb:01"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb8c749-13", "ovs_interfaceid": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.263 2 DEBUG nova.network.os_vif_util [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:eb:01,bridge_name='br-int',has_traffic_filtering=True,id=9eb8c749-13c4-43a1-8edd-b95f31dba7be,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9eb8c749-13') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.268 2 DEBUG nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:07:10 compute-0 nova_compute[192698]:   <uuid>cc19e5cf-bf34-4a91-a2d7-519421be8b85</uuid>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   <name>instance-00000008</name>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-290041760</nova:name>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:07:09</nova:creationTime>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:07:10 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:07:10 compute-0 nova_compute[192698]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Oct 01 14:07:10 compute-0 nova_compute[192698]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Oct 01 14:07:10 compute-0 nova_compute[192698]:           <nova:property name="hw_input_bus">usb</nova:property>
Oct 01 14:07:10 compute-0 nova_compute[192698]:           <nova:property name="hw_machine_type">q35</nova:property>
Oct 01 14:07:10 compute-0 nova_compute[192698]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Oct 01 14:07:10 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:07:10 compute-0 nova_compute[192698]:           <nova:property name="hw_video_model">virtio</nova:property>
Oct 01 14:07:10 compute-0 nova_compute[192698]:           <nova:property name="hw_vif_model">virtio</nova:property>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:user uuid="82619989ef1f48a39f1c1e7d64e4cb38">tempest-TestExecuteActionsViaActuator-2075848047-project-admin</nova:user>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:project uuid="67079b4774294271895bbf7b04f602e7">tempest-TestExecuteActionsViaActuator-2075848047</nova:project>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         <nova:port uuid="9eb8c749-13c4-43a1-8edd-b95f31dba7be">
Oct 01 14:07:10 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <system>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <entry name="serial">cc19e5cf-bf34-4a91-a2d7-519421be8b85</entry>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <entry name="uuid">cc19e5cf-bf34-4a91-a2d7-519421be8b85</entry>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     </system>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   <os>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   </os>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   <features>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   </features>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85/disk"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85/disk.config"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:ae:eb:01"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <target dev="tap9eb8c749-13"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85/console.log" append="off"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <video>
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     </video>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:07:10 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:07:10 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:07:10 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:07:10 compute-0 nova_compute[192698]: </domain>
Oct 01 14:07:10 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.271 2 DEBUG nova.virt.libvirt.vif [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:06:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-290041760',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-290041760',id=8,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:06:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-nzqp9uwt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model
='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteActionsViaActuator-2075848047-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:07:03Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=cc19e5cf-bf34-4a91-a2d7-519421be8b85,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "address": "fa:16:3e:ae:eb:01", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "vif_mac": "fa:16:3e:ae:eb:01"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb8c749-13", "ovs_interfaceid": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.272 2 DEBUG nova.network.os_vif_util [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "address": "fa:16:3e:ae:eb:01", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "vif_mac": "fa:16:3e:ae:eb:01"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb8c749-13", "ovs_interfaceid": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.272 2 DEBUG nova.network.os_vif_util [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ae:eb:01,bridge_name='br-int',has_traffic_filtering=True,id=9eb8c749-13c4-43a1-8edd-b95f31dba7be,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9eb8c749-13') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.273 2 DEBUG os_vif [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:eb:01,bridge_name='br-int',has_traffic_filtering=True,id=9eb8c749-13c4-43a1-8edd-b95f31dba7be,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9eb8c749-13') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.275 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.276 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.278 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'edb89fed-4aaa-5d3e-b83c-202cd68606a7', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.288 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9eb8c749-13, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.289 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap9eb8c749-13, col_values=(('qos', UUID('1ee268b0-44c1-4fee-b6d9-68906040e4ca')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.289 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap9eb8c749-13, col_values=(('external_ids', {'iface-id': '9eb8c749-13c4-43a1-8edd-b95f31dba7be', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ae:eb:01', 'vm-uuid': 'cc19e5cf-bf34-4a91-a2d7-519421be8b85'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:10 compute-0 NetworkManager[51741]: <info>  [1759327630.2933] manager: (tap9eb8c749-13): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.305 2 INFO os_vif [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ae:eb:01,bridge_name='br-int',has_traffic_filtering=True,id=9eb8c749-13c4-43a1-8edd-b95f31dba7be,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9eb8c749-13')
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.531 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.621 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.623 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.692 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.701 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.763 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.764 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.819 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.827 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.887 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.888 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.948 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:10 compute-0 nova_compute[192698]: 2025-10-01 14:07:10.955 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.020 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.021 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.114 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.416 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.419 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.453 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.455 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5138MB free_disk=73.16305923461914GB free_vcpus=4 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.455 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.456 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.865 2 DEBUG nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.865 2 DEBUG nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.866 2 DEBUG nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No VIF found with MAC fa:16:3e:ae:eb:01, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.867 2 INFO nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Using config drive
Oct 01 14:07:11 compute-0 NetworkManager[51741]: <info>  [1759327631.9473] manager: (tap9eb8c749-13): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Oct 01 14:07:11 compute-0 kernel: tap9eb8c749-13: entered promiscuous mode
Oct 01 14:07:11 compute-0 ovn_controller[94909]: 2025-10-01T14:07:11Z|00068|binding|INFO|Claiming lport 9eb8c749-13c4-43a1-8edd-b95f31dba7be for this chassis.
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:11 compute-0 ovn_controller[94909]: 2025-10-01T14:07:11Z|00069|binding|INFO|9eb8c749-13c4-43a1-8edd-b95f31dba7be: Claiming fa:16:3e:ae:eb:01 10.100.0.11
Oct 01 14:07:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:11.966 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:eb:01 10.100.0.11'], port_security=['fa:16:3e:ae:eb:01 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cc19e5cf-bf34-4a91-a2d7-519421be8b85', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67079b4774294271895bbf7b04f602e7', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'de7872a1-1f76-4b0f-8bd9-119520ff7a88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e3a455d-1f77-441e-b08a-0ec8231910e5, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=9eb8c749-13c4-43a1-8edd-b95f31dba7be) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:07:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:11.967 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 9eb8c749-13c4-43a1-8edd-b95f31dba7be in datapath e35f096a-fd75-4d70-ae58-8a76ae666b9d bound to our chassis
Oct 01 14:07:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:11.970 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e35f096a-fd75-4d70-ae58-8a76ae666b9d
Oct 01 14:07:11 compute-0 ovn_controller[94909]: 2025-10-01T14:07:11Z|00070|binding|INFO|Setting lport 9eb8c749-13c4-43a1-8edd-b95f31dba7be ovn-installed in OVS
Oct 01 14:07:11 compute-0 ovn_controller[94909]: 2025-10-01T14:07:11Z|00071|binding|INFO|Setting lport 9eb8c749-13c4-43a1-8edd-b95f31dba7be up in Southbound
Oct 01 14:07:11 compute-0 nova_compute[192698]: 2025-10-01 14:07:11.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:11.991 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c5dad12d-f33e-4846-9b08-23c95245514a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:12 compute-0 systemd-udevd[217541]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:07:12 compute-0 systemd-machined[152704]: New machine qemu-6-instance-00000008.
Oct 01 14:07:12 compute-0 NetworkManager[51741]: <info>  [1759327632.0172] device (tap9eb8c749-13): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:07:12 compute-0 NetworkManager[51741]: <info>  [1759327632.0189] device (tap9eb8c749-13): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:07:12 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000008.
Oct 01 14:07:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:12.033 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[71f92ec7-8903-444f-a0d8-7d6b2c5dad15]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:12.036 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[ca9d7748-18e5-4a39-9ade-e97cc3c7be75]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:12.078 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[46daf038-5dde-436e-84d4-c425aede3700]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:12.101 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[5fdcfb55-6ea8-4809-bfdf-970a8cd511f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape35f096a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:1b:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 19, 'tx_packets': 12, 'rx_bytes': 1294, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 19, 'tx_packets': 12, 'rx_bytes': 1294, 'tx_bytes': 696, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382912, 'reachable_time': 20041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217552, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:12.125 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[335b4315-73e9-4233-9ffc-27ead8b6d008]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382931, 'tstamp': 382931}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217555, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382936, 'tstamp': 382936}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217555, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:12.127 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape35f096a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:12 compute-0 nova_compute[192698]: 2025-10-01 14:07:12.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:12 compute-0 nova_compute[192698]: 2025-10-01 14:07:12.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:12.131 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape35f096a-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:12.131 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:12.132 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape35f096a-f0, col_values=(('external_ids', {'iface-id': '3f9111f1-79b1-4bf1-bb95-d924c71fb42c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:12.132 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:12.133 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[76b4d128-9cb6-4f92-8ddd-07ac84fb45e5]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-e35f096a-fd75-4d70-ae58-8a76ae666b9d\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID e35f096a-fd75-4d70-ae58-8a76ae666b9d\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:12 compute-0 nova_compute[192698]: 2025-10-01 14:07:12.347 2 INFO nova.compute.manager [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Post operation of migration started
Oct 01 14:07:12 compute-0 nova_compute[192698]: 2025-10-01 14:07:12.347 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:12 compute-0 nova_compute[192698]: 2025-10-01 14:07:12.482 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Applying migration context for instance cc19e5cf-bf34-4a91-a2d7-519421be8b85 as it has an incoming, in-progress migration 8b717a78-b855-476a-815f-00c4f6e57d6f. Migration status is post-migrating _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Oct 01 14:07:12 compute-0 nova_compute[192698]: 2025-10-01 14:07:12.485 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Migration for instance c5470fba-81f4-4592-8b40-1027a4dc1c83 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 01 14:07:12 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Oct 01 14:07:12 compute-0 systemd[217295]: Activating special unit Exit the Session...
Oct 01 14:07:12 compute-0 systemd[217295]: Stopped target Main User Target.
Oct 01 14:07:12 compute-0 systemd[217295]: Stopped target Basic System.
Oct 01 14:07:12 compute-0 systemd[217295]: Stopped target Paths.
Oct 01 14:07:12 compute-0 systemd[217295]: Stopped target Sockets.
Oct 01 14:07:12 compute-0 systemd[217295]: Stopped target Timers.
Oct 01 14:07:12 compute-0 systemd[217295]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct 01 14:07:12 compute-0 systemd[217295]: Stopped Daily Cleanup of User's Temporary Directories.
Oct 01 14:07:12 compute-0 systemd[217295]: Closed D-Bus User Message Bus Socket.
Oct 01 14:07:12 compute-0 systemd[217295]: Stopped Create User's Volatile Files and Directories.
Oct 01 14:07:12 compute-0 systemd[217295]: Removed slice User Application Slice.
Oct 01 14:07:12 compute-0 systemd[217295]: Reached target Shutdown.
Oct 01 14:07:12 compute-0 systemd[217295]: Finished Exit the Session.
Oct 01 14:07:12 compute-0 systemd[217295]: Reached target Exit the Session.
Oct 01 14:07:12 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Oct 01 14:07:12 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Oct 01 14:07:12 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Oct 01 14:07:12 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Oct 01 14:07:12 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Oct 01 14:07:12 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Oct 01 14:07:12 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Oct 01 14:07:12 compute-0 nova_compute[192698]: 2025-10-01 14:07:12.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:12 compute-0 nova_compute[192698]: 2025-10-01 14:07:12.840 2 DEBUG nova.compute.manager [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:07:12 compute-0 nova_compute[192698]: 2025-10-01 14:07:12.845 2 INFO nova.virt.libvirt.driver [-] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Instance running successfully.
Oct 01 14:07:12 compute-0 virtqemud[192597]: argument unsupported: QEMU guest agent is not configured
Oct 01 14:07:12 compute-0 nova_compute[192698]: 2025-10-01 14:07:12.850 2 DEBUG nova.virt.libvirt.guest [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Oct 01 14:07:12 compute-0 nova_compute[192698]: 2025-10-01 14:07:12.851 2 DEBUG nova.virt.libvirt.driver [None req-06a261b1-2944-495d-952d-283740649c58 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Oct 01 14:07:12 compute-0 nova_compute[192698]: 2025-10-01 14:07:12.993 2 INFO nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Updating resource usage from migration 8124fa88-0e6f-46a8-873f-043d3ab2268a
Oct 01 14:07:12 compute-0 nova_compute[192698]: 2025-10-01 14:07:12.994 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Starting to track incoming migration 8124fa88-0e6f-46a8-873f-043d3ab2268a with flavor 69702c4b-38f2-49d1-96d5-85671652c67e _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 01 14:07:13 compute-0 nova_compute[192698]: 2025-10-01 14:07:13.028 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:13 compute-0 nova_compute[192698]: 2025-10-01 14:07:13.028 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:13 compute-0 nova_compute[192698]: 2025-10-01 14:07:13.502 2 INFO nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Updating resource usage from migration 8b717a78-b855-476a-815f-00c4f6e57d6f
Oct 01 14:07:13 compute-0 nova_compute[192698]: 2025-10-01 14:07:13.542 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 28407011-1056-4714-96fc-1e8904bbcf1f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:07:13 compute-0 nova_compute[192698]: 2025-10-01 14:07:13.543 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance ff5702e3-c6c5-4b82-a9c4-6a06747a4cae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:07:13 compute-0 nova_compute[192698]: 2025-10-01 14:07:13.543 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance ad84d705-7d86-4faf-a1d4-b099e7b6a80f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:07:14 compute-0 nova_compute[192698]: 2025-10-01 14:07:14.049 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-c5470fba-81f4-4592-8b40-1027a4dc1c83" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:07:14 compute-0 nova_compute[192698]: 2025-10-01 14:07:14.049 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-c5470fba-81f4-4592-8b40-1027a4dc1c83" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:07:14 compute-0 nova_compute[192698]: 2025-10-01 14:07:14.049 2 DEBUG nova.network.neutron [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:07:14 compute-0 nova_compute[192698]: 2025-10-01 14:07:14.050 2 WARNING nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance c5470fba-81f4-4592-8b40-1027a4dc1c83 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 01 14:07:14 compute-0 nova_compute[192698]: 2025-10-01 14:07:14.050 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance cc19e5cf-bf34-4a91-a2d7-519421be8b85 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:07:14 compute-0 nova_compute[192698]: 2025-10-01 14:07:14.050 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 5 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:07:14 compute-0 nova_compute[192698]: 2025-10-01 14:07:14.051 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1152MB phys_disk=79GB used_disk=5GB total_vcpus=8 used_vcpus=5 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:07:11 up  1:06,  0 user,  load average: 0.52, 0.39, 0.50\n', 'num_instances': '4', 'num_vm_active': '4', 'num_task_None': '3', 'num_os_type_None': '4', 'num_proj_67079b4774294271895bbf7b04f602e7': '4', 'io_workload': '0', 'num_task_resize_finish': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:07:14 compute-0 nova_compute[192698]: 2025-10-01 14:07:14.137 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:07:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:14.239 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:14.240 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:14.241 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:14 compute-0 nova_compute[192698]: 2025-10-01 14:07:14.556 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:14 compute-0 nova_compute[192698]: 2025-10-01 14:07:14.644 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:07:15 compute-0 nova_compute[192698]: 2025-10-01 14:07:15.157 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:07:15 compute-0 nova_compute[192698]: 2025-10-01 14:07:15.158 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.702s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:15 compute-0 nova_compute[192698]: 2025-10-01 14:07:15.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:15 compute-0 nova_compute[192698]: 2025-10-01 14:07:15.392 2 DEBUG nova.compute.manager [req-0169656b-26fb-4422-b2fd-60aece7110e6 req-08a1a4d0-c15c-4099-9f4b-8831ec50a77a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received event network-vif-plugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:15 compute-0 nova_compute[192698]: 2025-10-01 14:07:15.392 2 DEBUG oslo_concurrency.lockutils [req-0169656b-26fb-4422-b2fd-60aece7110e6 req-08a1a4d0-c15c-4099-9f4b-8831ec50a77a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:15 compute-0 nova_compute[192698]: 2025-10-01 14:07:15.393 2 DEBUG oslo_concurrency.lockutils [req-0169656b-26fb-4422-b2fd-60aece7110e6 req-08a1a4d0-c15c-4099-9f4b-8831ec50a77a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:15 compute-0 nova_compute[192698]: 2025-10-01 14:07:15.393 2 DEBUG oslo_concurrency.lockutils [req-0169656b-26fb-4422-b2fd-60aece7110e6 req-08a1a4d0-c15c-4099-9f4b-8831ec50a77a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:15 compute-0 nova_compute[192698]: 2025-10-01 14:07:15.393 2 DEBUG nova.compute.manager [req-0169656b-26fb-4422-b2fd-60aece7110e6 req-08a1a4d0-c15c-4099-9f4b-8831ec50a77a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] No waiting events found dispatching network-vif-plugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:07:15 compute-0 nova_compute[192698]: 2025-10-01 14:07:15.394 2 WARNING nova.compute.manager [req-0169656b-26fb-4422-b2fd-60aece7110e6 req-08a1a4d0-c15c-4099-9f4b-8831ec50a77a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received unexpected event network-vif-plugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be for instance with vm_state resized and task_state None.
Oct 01 14:07:15 compute-0 nova_compute[192698]: 2025-10-01 14:07:15.539 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:15 compute-0 nova_compute[192698]: 2025-10-01 14:07:15.690 2 DEBUG nova.network.neutron [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Updating instance_info_cache with network_info: [{"id": "bae9bb47-22fa-49ee-9b7e-fc3a13b33880", "address": "fa:16:3e:1d:07:29", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae9bb47-22", "ovs_interfaceid": "bae9bb47-22fa-49ee-9b7e-fc3a13b33880", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:07:16 compute-0 nova_compute[192698]: 2025-10-01 14:07:16.148 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:07:16 compute-0 nova_compute[192698]: 2025-10-01 14:07:16.149 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:07:16 compute-0 nova_compute[192698]: 2025-10-01 14:07:16.197 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-c5470fba-81f4-4592-8b40-1027a4dc1c83" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:07:16 compute-0 podman[217568]: 2025-10-01 14:07:16.259741475 +0000 UTC m=+0.148858253 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 01 14:07:16 compute-0 podman[217569]: 2025-10-01 14:07:16.32339022 +0000 UTC m=+0.212251711 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible)
Oct 01 14:07:16 compute-0 nova_compute[192698]: 2025-10-01 14:07:16.675 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:07:16 compute-0 nova_compute[192698]: 2025-10-01 14:07:16.675 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:07:16 compute-0 nova_compute[192698]: 2025-10-01 14:07:16.676 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:07:16 compute-0 nova_compute[192698]: 2025-10-01 14:07:16.676 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:07:16 compute-0 nova_compute[192698]: 2025-10-01 14:07:16.676 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:07:16 compute-0 nova_compute[192698]: 2025-10-01 14:07:16.726 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:16 compute-0 nova_compute[192698]: 2025-10-01 14:07:16.727 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:16 compute-0 nova_compute[192698]: 2025-10-01 14:07:16.727 2 DEBUG oslo_concurrency.lockutils [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:16 compute-0 nova_compute[192698]: 2025-10-01 14:07:16.733 2 INFO nova.virt.libvirt.driver [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 01 14:07:16 compute-0 virtqemud[192597]: Domain id=5 name='instance-00000006' uuid=c5470fba-81f4-4592-8b40-1027a4dc1c83 is tainted: custom-monitor
Oct 01 14:07:17 compute-0 nova_compute[192698]: 2025-10-01 14:07:17.492 2 DEBUG nova.compute.manager [req-e93a4719-434c-4865-b250-92bd227ce2bf req-b4d5eaac-3c84-4019-8e0d-4d89bf366ee3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received event network-vif-plugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:17 compute-0 nova_compute[192698]: 2025-10-01 14:07:17.492 2 DEBUG oslo_concurrency.lockutils [req-e93a4719-434c-4865-b250-92bd227ce2bf req-b4d5eaac-3c84-4019-8e0d-4d89bf366ee3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:17 compute-0 nova_compute[192698]: 2025-10-01 14:07:17.493 2 DEBUG oslo_concurrency.lockutils [req-e93a4719-434c-4865-b250-92bd227ce2bf req-b4d5eaac-3c84-4019-8e0d-4d89bf366ee3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:17 compute-0 nova_compute[192698]: 2025-10-01 14:07:17.493 2 DEBUG oslo_concurrency.lockutils [req-e93a4719-434c-4865-b250-92bd227ce2bf req-b4d5eaac-3c84-4019-8e0d-4d89bf366ee3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:17 compute-0 nova_compute[192698]: 2025-10-01 14:07:17.493 2 DEBUG nova.compute.manager [req-e93a4719-434c-4865-b250-92bd227ce2bf req-b4d5eaac-3c84-4019-8e0d-4d89bf366ee3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] No waiting events found dispatching network-vif-plugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:07:17 compute-0 nova_compute[192698]: 2025-10-01 14:07:17.494 2 WARNING nova.compute.manager [req-e93a4719-434c-4865-b250-92bd227ce2bf req-b4d5eaac-3c84-4019-8e0d-4d89bf366ee3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received unexpected event network-vif-plugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be for instance with vm_state resized and task_state None.
Oct 01 14:07:17 compute-0 nova_compute[192698]: 2025-10-01 14:07:17.744 2 INFO nova.virt.libvirt.driver [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 01 14:07:17 compute-0 nova_compute[192698]: 2025-10-01 14:07:17.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:18 compute-0 nova_compute[192698]: 2025-10-01 14:07:18.752 2 INFO nova.virt.libvirt.driver [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 01 14:07:18 compute-0 nova_compute[192698]: 2025-10-01 14:07:18.758 2 DEBUG nova.compute.manager [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:07:19 compute-0 nova_compute[192698]: 2025-10-01 14:07:19.271 2 DEBUG nova.objects.instance [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:07:20 compute-0 nova_compute[192698]: 2025-10-01 14:07:20.292 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:20 compute-0 nova_compute[192698]: 2025-10-01 14:07:20.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:21 compute-0 nova_compute[192698]: 2025-10-01 14:07:21.041 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:21 compute-0 nova_compute[192698]: 2025-10-01 14:07:21.042 2 WARNING neutronclient.v2_0.client [None req-d8e123e5-191c-4299-bf77-dec0d80bb271 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:22 compute-0 nova_compute[192698]: 2025-10-01 14:07:22.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:24 compute-0 podman[217615]: 2025-10-01 14:07:24.190814929 +0000 UTC m=+0.092070277 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Oct 01 14:07:25 compute-0 nova_compute[192698]: 2025-10-01 14:07:25.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:27 compute-0 nova_compute[192698]: 2025-10-01 14:07:27.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:28 compute-0 ovn_controller[94909]: 2025-10-01T14:07:28Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ae:eb:01 10.100.0.11
Oct 01 14:07:29 compute-0 podman[203144]: time="2025-10-01T14:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:07:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:07:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3481 "" "Go-http-client/1.1"
Oct 01 14:07:30 compute-0 podman[217644]: 2025-10-01 14:07:30.183448744 +0000 UTC m=+0.089754026 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Oct 01 14:07:30 compute-0 nova_compute[192698]: 2025-10-01 14:07:30.218 2 DEBUG oslo_concurrency.lockutils [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:30 compute-0 nova_compute[192698]: 2025-10-01 14:07:30.219 2 DEBUG oslo_concurrency.lockutils [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:30 compute-0 nova_compute[192698]: 2025-10-01 14:07:30.219 2 DEBUG oslo_concurrency.lockutils [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:30 compute-0 nova_compute[192698]: 2025-10-01 14:07:30.219 2 DEBUG oslo_concurrency.lockutils [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:30 compute-0 nova_compute[192698]: 2025-10-01 14:07:30.220 2 DEBUG oslo_concurrency.lockutils [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:30 compute-0 podman[217645]: 2025-10-01 14:07:30.231674861 +0000 UTC m=+0.121507509 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:07:30 compute-0 nova_compute[192698]: 2025-10-01 14:07:30.232 2 INFO nova.compute.manager [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Terminating instance
Oct 01 14:07:30 compute-0 nova_compute[192698]: 2025-10-01 14:07:30.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:30 compute-0 nova_compute[192698]: 2025-10-01 14:07:30.748 2 DEBUG nova.compute.manager [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:07:30 compute-0 kernel: tapc3ad6fe1-af (unregistering): left promiscuous mode
Oct 01 14:07:30 compute-0 NetworkManager[51741]: <info>  [1759327650.7928] device (tapc3ad6fe1-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:07:30 compute-0 nova_compute[192698]: 2025-10-01 14:07:30.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:30 compute-0 ovn_controller[94909]: 2025-10-01T14:07:30Z|00072|binding|INFO|Releasing lport c3ad6fe1-af17-4f0b-86c4-8d4384248932 from this chassis (sb_readonly=0)
Oct 01 14:07:30 compute-0 ovn_controller[94909]: 2025-10-01T14:07:30Z|00073|binding|INFO|Setting lport c3ad6fe1-af17-4f0b-86c4-8d4384248932 down in Southbound
Oct 01 14:07:30 compute-0 ovn_controller[94909]: 2025-10-01T14:07:30Z|00074|binding|INFO|Removing iface tapc3ad6fe1-af ovn-installed in OVS
Oct 01 14:07:30 compute-0 nova_compute[192698]: 2025-10-01 14:07:30.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:30 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:30.819 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:67:38 10.100.0.6'], port_security=['fa:16:3e:15:67:38 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ad84d705-7d86-4faf-a1d4-b099e7b6a80f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67079b4774294271895bbf7b04f602e7', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'de7872a1-1f76-4b0f-8bd9-119520ff7a88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e3a455d-1f77-441e-b08a-0ec8231910e5, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=c3ad6fe1-af17-4f0b-86c4-8d4384248932) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:07:30 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:30.820 103791 INFO neutron.agent.ovn.metadata.agent [-] Port c3ad6fe1-af17-4f0b-86c4-8d4384248932 in datapath e35f096a-fd75-4d70-ae58-8a76ae666b9d unbound from our chassis
Oct 01 14:07:30 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:30.822 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e35f096a-fd75-4d70-ae58-8a76ae666b9d
Oct 01 14:07:30 compute-0 nova_compute[192698]: 2025-10-01 14:07:30.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:30 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:30.854 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e93b87a4-6ec3-4807-a6b3-0f6234aa8232]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:30 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Deactivated successfully.
Oct 01 14:07:30 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000009.scope: Consumed 14.822s CPU time.
Oct 01 14:07:30 compute-0 systemd-machined[152704]: Machine qemu-4-instance-00000009 terminated.
Oct 01 14:07:30 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:30.905 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[e3cc0eea-f126-4771-b175-370f841be667]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:30 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:30.908 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[1a63817b-4316-4bc1-bfe4-a4f903d17f60]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:30 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:30.946 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[12118a81-d999-4a2b-9f07-7e1d8416115e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:30 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:30.978 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[de399c6c-b838-4d22-a62a-e16f532443c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape35f096a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:1b:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 34, 'tx_packets': 14, 'rx_bytes': 1924, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 34, 'tx_packets': 14, 'rx_bytes': 1924, 'tx_bytes': 780, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382912, 'reachable_time': 20041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217698, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:31.012 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f43d9558-493e-4b95-89fb-1c642af33393]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382931, 'tstamp': 382931}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217705, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382936, 'tstamp': 382936}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217705, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:31.015 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape35f096a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:31.026 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape35f096a-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:31.026 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:31.027 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape35f096a-f0, col_values=(('external_ids', {'iface-id': '3f9111f1-79b1-4bf1-bb95-d924c71fb42c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:31.027 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:31.029 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ad2c9ff3-bcab-4e52-aff8-8333a0f2c9eb]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-e35f096a-fd75-4d70-ae58-8a76ae666b9d\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID e35f096a-fd75-4d70-ae58-8a76ae666b9d\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.038 2 INFO nova.virt.libvirt.driver [-] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Instance destroyed successfully.
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.038 2 DEBUG nova.objects.instance [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lazy-loading 'resources' on Instance uuid ad84d705-7d86-4faf-a1d4-b099e7b6a80f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.136 2 DEBUG nova.compute.manager [req-c41df3ad-e337-4061-94e9-a467defdc76b req-5dcf7007-984f-4b57-a925-c4ba7a73f20c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Received event network-vif-unplugged-c3ad6fe1-af17-4f0b-86c4-8d4384248932 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.136 2 DEBUG oslo_concurrency.lockutils [req-c41df3ad-e337-4061-94e9-a467defdc76b req-5dcf7007-984f-4b57-a925-c4ba7a73f20c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.136 2 DEBUG oslo_concurrency.lockutils [req-c41df3ad-e337-4061-94e9-a467defdc76b req-5dcf7007-984f-4b57-a925-c4ba7a73f20c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.137 2 DEBUG oslo_concurrency.lockutils [req-c41df3ad-e337-4061-94e9-a467defdc76b req-5dcf7007-984f-4b57-a925-c4ba7a73f20c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.137 2 DEBUG nova.compute.manager [req-c41df3ad-e337-4061-94e9-a467defdc76b req-5dcf7007-984f-4b57-a925-c4ba7a73f20c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] No waiting events found dispatching network-vif-unplugged-c3ad6fe1-af17-4f0b-86c4-8d4384248932 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.137 2 DEBUG nova.compute.manager [req-c41df3ad-e337-4061-94e9-a467defdc76b req-5dcf7007-984f-4b57-a925-c4ba7a73f20c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Received event network-vif-unplugged-c3ad6fe1-af17-4f0b-86c4-8d4384248932 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:07:31 compute-0 openstack_network_exporter[205307]: ERROR   14:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:07:31 compute-0 openstack_network_exporter[205307]: ERROR   14:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:07:31 compute-0 openstack_network_exporter[205307]: ERROR   14:07:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:07:31 compute-0 openstack_network_exporter[205307]: ERROR   14:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:07:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:07:31 compute-0 openstack_network_exporter[205307]: ERROR   14:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:07:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.545 2 DEBUG nova.virt.libvirt.vif [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:06:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1504825469',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1504825469',id=9,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:06:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-3x429zj8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteActionsViaActuator-2075848047-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:06:38Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=ad84d705-7d86-4faf-a1d4-b099e7b6a80f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "address": "fa:16:3e:15:67:38", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3ad6fe1-af", "ovs_interfaceid": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.545 2 DEBUG nova.network.os_vif_util [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converting VIF {"id": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "address": "fa:16:3e:15:67:38", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc3ad6fe1-af", "ovs_interfaceid": "c3ad6fe1-af17-4f0b-86c4-8d4384248932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.546 2 DEBUG nova.network.os_vif_util [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:67:38,bridge_name='br-int',has_traffic_filtering=True,id=c3ad6fe1-af17-4f0b-86c4-8d4384248932,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3ad6fe1-af') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.546 2 DEBUG os_vif [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:67:38,bridge_name='br-int',has_traffic_filtering=True,id=c3ad6fe1-af17-4f0b-86c4-8d4384248932,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3ad6fe1-af') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.548 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc3ad6fe1-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.552 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0ff30eed-2380-4298-8432-18882c47bb3e) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.556 2 INFO os_vif [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:67:38,bridge_name='br-int',has_traffic_filtering=True,id=c3ad6fe1-af17-4f0b-86c4-8d4384248932,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc3ad6fe1-af')
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.556 2 INFO nova.virt.libvirt.driver [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Deleting instance files /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f_del
Oct 01 14:07:31 compute-0 nova_compute[192698]: 2025-10-01 14:07:31.557 2 INFO nova.virt.libvirt.driver [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Deletion of /var/lib/nova/instances/ad84d705-7d86-4faf-a1d4-b099e7b6a80f_del complete
Oct 01 14:07:32 compute-0 nova_compute[192698]: 2025-10-01 14:07:32.070 2 INFO nova.compute.manager [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 01 14:07:32 compute-0 nova_compute[192698]: 2025-10-01 14:07:32.070 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:07:32 compute-0 nova_compute[192698]: 2025-10-01 14:07:32.071 2 DEBUG nova.compute.manager [-] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:07:32 compute-0 nova_compute[192698]: 2025-10-01 14:07:32.071 2 DEBUG nova.network.neutron [-] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:07:32 compute-0 nova_compute[192698]: 2025-10-01 14:07:32.072 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:32 compute-0 nova_compute[192698]: 2025-10-01 14:07:32.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:33 compute-0 nova_compute[192698]: 2025-10-01 14:07:33.032 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:33 compute-0 nova_compute[192698]: 2025-10-01 14:07:33.201 2 DEBUG nova.compute.manager [req-3d6be4ed-a01d-43ee-840a-652bd485aa8b req-2eda57d4-aaf1-4b79-8a71-72bdffa77308 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Received event network-vif-unplugged-c3ad6fe1-af17-4f0b-86c4-8d4384248932 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:33 compute-0 nova_compute[192698]: 2025-10-01 14:07:33.201 2 DEBUG oslo_concurrency.lockutils [req-3d6be4ed-a01d-43ee-840a-652bd485aa8b req-2eda57d4-aaf1-4b79-8a71-72bdffa77308 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:33 compute-0 nova_compute[192698]: 2025-10-01 14:07:33.201 2 DEBUG oslo_concurrency.lockutils [req-3d6be4ed-a01d-43ee-840a-652bd485aa8b req-2eda57d4-aaf1-4b79-8a71-72bdffa77308 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:33 compute-0 nova_compute[192698]: 2025-10-01 14:07:33.201 2 DEBUG oslo_concurrency.lockutils [req-3d6be4ed-a01d-43ee-840a-652bd485aa8b req-2eda57d4-aaf1-4b79-8a71-72bdffa77308 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:33 compute-0 nova_compute[192698]: 2025-10-01 14:07:33.202 2 DEBUG nova.compute.manager [req-3d6be4ed-a01d-43ee-840a-652bd485aa8b req-2eda57d4-aaf1-4b79-8a71-72bdffa77308 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] No waiting events found dispatching network-vif-unplugged-c3ad6fe1-af17-4f0b-86c4-8d4384248932 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:07:33 compute-0 nova_compute[192698]: 2025-10-01 14:07:33.202 2 DEBUG nova.compute.manager [req-3d6be4ed-a01d-43ee-840a-652bd485aa8b req-2eda57d4-aaf1-4b79-8a71-72bdffa77308 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Received event network-vif-unplugged-c3ad6fe1-af17-4f0b-86c4-8d4384248932 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:07:34 compute-0 nova_compute[192698]: 2025-10-01 14:07:34.570 2 DEBUG nova.network.neutron [-] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:07:35 compute-0 nova_compute[192698]: 2025-10-01 14:07:35.077 2 INFO nova.compute.manager [-] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Took 3.01 seconds to deallocate network for instance.
Oct 01 14:07:35 compute-0 nova_compute[192698]: 2025-10-01 14:07:35.269 2 DEBUG nova.compute.manager [req-111e98f5-75ca-4b36-b5b4-b00970a47817 req-302ffdbf-6896-4c1b-8b6b-7feed40a9da4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ad84d705-7d86-4faf-a1d4-b099e7b6a80f] Received event network-vif-deleted-c3ad6fe1-af17-4f0b-86c4-8d4384248932 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:35 compute-0 nova_compute[192698]: 2025-10-01 14:07:35.602 2 DEBUG oslo_concurrency.lockutils [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:35 compute-0 nova_compute[192698]: 2025-10-01 14:07:35.603 2 DEBUG oslo_concurrency.lockutils [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:35 compute-0 nova_compute[192698]: 2025-10-01 14:07:35.732 2 DEBUG nova.compute.provider_tree [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:07:36 compute-0 nova_compute[192698]: 2025-10-01 14:07:36.241 2 DEBUG nova.scheduler.client.report [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:07:36 compute-0 nova_compute[192698]: 2025-10-01 14:07:36.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:36 compute-0 nova_compute[192698]: 2025-10-01 14:07:36.754 2 DEBUG oslo_concurrency.lockutils [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.151s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:36 compute-0 nova_compute[192698]: 2025-10-01 14:07:36.780 2 INFO nova.scheduler.client.report [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Deleted allocations for instance ad84d705-7d86-4faf-a1d4-b099e7b6a80f
Oct 01 14:07:37 compute-0 podman[217718]: 2025-10-01 14:07:37.185955603 +0000 UTC m=+0.090165926 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:07:37 compute-0 nova_compute[192698]: 2025-10-01 14:07:37.810 2 DEBUG oslo_concurrency.lockutils [None req-670900b5-6f8d-4b86-8279-ff6dc36d9146 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ad84d705-7d86-4faf-a1d4-b099e7b6a80f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.591s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:37 compute-0 nova_compute[192698]: 2025-10-01 14:07:37.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:38 compute-0 nova_compute[192698]: 2025-10-01 14:07:38.513 2 DEBUG oslo_concurrency.lockutils [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:38 compute-0 nova_compute[192698]: 2025-10-01 14:07:38.514 2 DEBUG oslo_concurrency.lockutils [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:38 compute-0 nova_compute[192698]: 2025-10-01 14:07:38.514 2 DEBUG oslo_concurrency.lockutils [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:38 compute-0 nova_compute[192698]: 2025-10-01 14:07:38.515 2 DEBUG oslo_concurrency.lockutils [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:38 compute-0 nova_compute[192698]: 2025-10-01 14:07:38.515 2 DEBUG oslo_concurrency.lockutils [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:38 compute-0 nova_compute[192698]: 2025-10-01 14:07:38.526 2 INFO nova.compute.manager [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Terminating instance
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.047 2 DEBUG nova.compute.manager [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:07:39 compute-0 kernel: tap9eb8c749-13 (unregistering): left promiscuous mode
Oct 01 14:07:39 compute-0 NetworkManager[51741]: <info>  [1759327659.0830] device (tap9eb8c749-13): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:39 compute-0 ovn_controller[94909]: 2025-10-01T14:07:39Z|00075|binding|INFO|Releasing lport 9eb8c749-13c4-43a1-8edd-b95f31dba7be from this chassis (sb_readonly=0)
Oct 01 14:07:39 compute-0 ovn_controller[94909]: 2025-10-01T14:07:39Z|00076|binding|INFO|Setting lport 9eb8c749-13c4-43a1-8edd-b95f31dba7be down in Southbound
Oct 01 14:07:39 compute-0 ovn_controller[94909]: 2025-10-01T14:07:39Z|00077|binding|INFO|Removing iface tap9eb8c749-13 ovn-installed in OVS
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.137 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ae:eb:01 10.100.0.11'], port_security=['fa:16:3e:ae:eb:01 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cc19e5cf-bf34-4a91-a2d7-519421be8b85', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67079b4774294271895bbf7b04f602e7', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'de7872a1-1f76-4b0f-8bd9-119520ff7a88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e3a455d-1f77-441e-b08a-0ec8231910e5, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=9eb8c749-13c4-43a1-8edd-b95f31dba7be) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.138 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 9eb8c749-13c4-43a1-8edd-b95f31dba7be in datapath e35f096a-fd75-4d70-ae58-8a76ae666b9d unbound from our chassis
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.140 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e35f096a-fd75-4d70-ae58-8a76ae666b9d
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.159 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[dfa60099-4bd0-4791-a4c5-301900c9c6e7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:39 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Deactivated successfully.
Oct 01 14:07:39 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Consumed 16.648s CPU time.
Oct 01 14:07:39 compute-0 systemd-machined[152704]: Machine qemu-6-instance-00000008 terminated.
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.207 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[4c56acf6-70fd-4d60-a53c-f1ee2aa2458d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.210 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[e15d6164-e33e-41c8-82d4-bc007cc6b088]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.255 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[8ec691d9-54df-4c3b-bdd7-c93f41bb2c2d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.275 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[6482f55a-4b43-4e42-b232-b2920f9c6c91]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape35f096a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:1b:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 16, 'rx_bytes': 2008, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 16, 'rx_bytes': 2008, 'tx_bytes': 864, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382912, 'reachable_time': 20041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217755, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.302 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c0980deb-e7fd-4293-8ae0-b18d02fce682]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382931, 'tstamp': 382931}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217761, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382936, 'tstamp': 382936}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217761, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.304 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape35f096a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.319 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape35f096a-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.319 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.320 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape35f096a-f0, col_values=(('external_ids', {'iface-id': '3f9111f1-79b1-4bf1-bb95-d924c71fb42c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.320 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.322 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[71bc1e24-bade-4fc1-8d3d-6d169b47d4cd]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-e35f096a-fd75-4d70-ae58-8a76ae666b9d\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID e35f096a-fd75-4d70-ae58-8a76ae666b9d\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.345 2 INFO nova.virt.libvirt.driver [-] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Instance destroyed successfully.
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.345 2 DEBUG nova.objects.instance [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lazy-loading 'resources' on Instance uuid cc19e5cf-bf34-4a91-a2d7-519421be8b85 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.425 2 DEBUG nova.compute.manager [req-7df7ff40-059c-4d40-af11-2c78e48dbc50 req-dc05c351-302c-4fbb-8288-4beb0cd832aa 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received event network-vif-unplugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.425 2 DEBUG oslo_concurrency.lockutils [req-7df7ff40-059c-4d40-af11-2c78e48dbc50 req-dc05c351-302c-4fbb-8288-4beb0cd832aa 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.425 2 DEBUG oslo_concurrency.lockutils [req-7df7ff40-059c-4d40-af11-2c78e48dbc50 req-dc05c351-302c-4fbb-8288-4beb0cd832aa 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.425 2 DEBUG oslo_concurrency.lockutils [req-7df7ff40-059c-4d40-af11-2c78e48dbc50 req-dc05c351-302c-4fbb-8288-4beb0cd832aa 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.425 2 DEBUG nova.compute.manager [req-7df7ff40-059c-4d40-af11-2c78e48dbc50 req-dc05c351-302c-4fbb-8288-4beb0cd832aa 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] No waiting events found dispatching network-vif-unplugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.426 2 DEBUG nova.compute.manager [req-7df7ff40-059c-4d40-af11-2c78e48dbc50 req-dc05c351-302c-4fbb-8288-4beb0cd832aa 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received event network-vif-unplugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.487 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:39.489 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.852 2 DEBUG nova.virt.libvirt.vif [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:06:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-290041760',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-290041760',id=8,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:07:13Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-nzqp9uwt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteActionsViaActuator-2075848047-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:07:26Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=cc19e5cf-bf34-4a91-a2d7-519421be8b85,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "address": "fa:16:3e:ae:eb:01", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb8c749-13", "ovs_interfaceid": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.853 2 DEBUG nova.network.os_vif_util [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converting VIF {"id": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "address": "fa:16:3e:ae:eb:01", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9eb8c749-13", "ovs_interfaceid": "9eb8c749-13c4-43a1-8edd-b95f31dba7be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.854 2 DEBUG nova.network.os_vif_util [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ae:eb:01,bridge_name='br-int',has_traffic_filtering=True,id=9eb8c749-13c4-43a1-8edd-b95f31dba7be,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9eb8c749-13') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.854 2 DEBUG os_vif [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:eb:01,bridge_name='br-int',has_traffic_filtering=True,id=9eb8c749-13c4-43a1-8edd-b95f31dba7be,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9eb8c749-13') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.857 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9eb8c749-13, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.863 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=1ee268b0-44c1-4fee-b6d9-68906040e4ca) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.869 2 INFO os_vif [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ae:eb:01,bridge_name='br-int',has_traffic_filtering=True,id=9eb8c749-13c4-43a1-8edd-b95f31dba7be,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9eb8c749-13')
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.870 2 INFO nova.virt.libvirt.driver [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Deleting instance files /var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85_del
Oct 01 14:07:39 compute-0 nova_compute[192698]: 2025-10-01 14:07:39.876 2 INFO nova.virt.libvirt.driver [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Deletion of /var/lib/nova/instances/cc19e5cf-bf34-4a91-a2d7-519421be8b85_del complete
Oct 01 14:07:40 compute-0 nova_compute[192698]: 2025-10-01 14:07:40.399 2 INFO nova.compute.manager [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Took 1.35 seconds to destroy the instance on the hypervisor.
Oct 01 14:07:40 compute-0 nova_compute[192698]: 2025-10-01 14:07:40.400 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:07:40 compute-0 nova_compute[192698]: 2025-10-01 14:07:40.400 2 DEBUG nova.compute.manager [-] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:07:40 compute-0 nova_compute[192698]: 2025-10-01 14:07:40.400 2 DEBUG nova.network.neutron [-] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:07:40 compute-0 nova_compute[192698]: 2025-10-01 14:07:40.401 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:40 compute-0 nova_compute[192698]: 2025-10-01 14:07:40.527 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:40 compute-0 nova_compute[192698]: 2025-10-01 14:07:40.825 2 DEBUG nova.compute.manager [req-6b9e2729-57db-4718-a610-82d890f36ef6 req-3d52ddeb-0b9d-4354-ba98-ec85897ba8eb 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received event network-vif-deleted-9eb8c749-13c4-43a1-8edd-b95f31dba7be external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:40 compute-0 nova_compute[192698]: 2025-10-01 14:07:40.825 2 INFO nova.compute.manager [req-6b9e2729-57db-4718-a610-82d890f36ef6 req-3d52ddeb-0b9d-4354-ba98-ec85897ba8eb 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Neutron deleted interface 9eb8c749-13c4-43a1-8edd-b95f31dba7be; detaching it from the instance and deleting it from the info cache
Oct 01 14:07:40 compute-0 nova_compute[192698]: 2025-10-01 14:07:40.825 2 DEBUG nova.network.neutron [req-6b9e2729-57db-4718-a610-82d890f36ef6 req-3d52ddeb-0b9d-4354-ba98-ec85897ba8eb 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:07:41 compute-0 nova_compute[192698]: 2025-10-01 14:07:41.280 2 DEBUG nova.network.neutron [-] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:07:41 compute-0 nova_compute[192698]: 2025-10-01 14:07:41.332 2 DEBUG nova.compute.manager [req-6b9e2729-57db-4718-a610-82d890f36ef6 req-3d52ddeb-0b9d-4354-ba98-ec85897ba8eb 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Detach interface failed, port_id=9eb8c749-13c4-43a1-8edd-b95f31dba7be, reason: Instance cc19e5cf-bf34-4a91-a2d7-519421be8b85 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:07:41 compute-0 nova_compute[192698]: 2025-10-01 14:07:41.507 2 DEBUG nova.compute.manager [req-ace8e13f-0ac7-48bd-8c39-ed23956230ce req-ffec685d-0672-466b-93ad-e561ee1e1843 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received event network-vif-unplugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:41 compute-0 nova_compute[192698]: 2025-10-01 14:07:41.508 2 DEBUG oslo_concurrency.lockutils [req-ace8e13f-0ac7-48bd-8c39-ed23956230ce req-ffec685d-0672-466b-93ad-e561ee1e1843 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:41 compute-0 nova_compute[192698]: 2025-10-01 14:07:41.508 2 DEBUG oslo_concurrency.lockutils [req-ace8e13f-0ac7-48bd-8c39-ed23956230ce req-ffec685d-0672-466b-93ad-e561ee1e1843 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:41 compute-0 nova_compute[192698]: 2025-10-01 14:07:41.508 2 DEBUG oslo_concurrency.lockutils [req-ace8e13f-0ac7-48bd-8c39-ed23956230ce req-ffec685d-0672-466b-93ad-e561ee1e1843 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:41 compute-0 nova_compute[192698]: 2025-10-01 14:07:41.509 2 DEBUG nova.compute.manager [req-ace8e13f-0ac7-48bd-8c39-ed23956230ce req-ffec685d-0672-466b-93ad-e561ee1e1843 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] No waiting events found dispatching network-vif-unplugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:07:41 compute-0 nova_compute[192698]: 2025-10-01 14:07:41.509 2 DEBUG nova.compute.manager [req-ace8e13f-0ac7-48bd-8c39-ed23956230ce req-ffec685d-0672-466b-93ad-e561ee1e1843 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Received event network-vif-unplugged-9eb8c749-13c4-43a1-8edd-b95f31dba7be for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:07:41 compute-0 nova_compute[192698]: 2025-10-01 14:07:41.787 2 INFO nova.compute.manager [-] [instance: cc19e5cf-bf34-4a91-a2d7-519421be8b85] Took 1.39 seconds to deallocate network for instance.
Oct 01 14:07:42 compute-0 nova_compute[192698]: 2025-10-01 14:07:42.316 2 DEBUG oslo_concurrency.lockutils [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:42 compute-0 nova_compute[192698]: 2025-10-01 14:07:42.316 2 DEBUG oslo_concurrency.lockutils [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:42 compute-0 nova_compute[192698]: 2025-10-01 14:07:42.419 2 DEBUG nova.compute.provider_tree [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:07:42 compute-0 nova_compute[192698]: 2025-10-01 14:07:42.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:43 compute-0 nova_compute[192698]: 2025-10-01 14:07:43.012 2 DEBUG nova.scheduler.client.report [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:07:43 compute-0 nova_compute[192698]: 2025-10-01 14:07:43.579 2 DEBUG oslo_concurrency.lockutils [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.263s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:43 compute-0 nova_compute[192698]: 2025-10-01 14:07:43.805 2 INFO nova.scheduler.client.report [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Deleted allocations for instance cc19e5cf-bf34-4a91-a2d7-519421be8b85
Oct 01 14:07:44 compute-0 nova_compute[192698]: 2025-10-01 14:07:44.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:44 compute-0 nova_compute[192698]: 2025-10-01 14:07:44.892 2 DEBUG oslo_concurrency.lockutils [None req-73f5b35b-238c-485c-a322-f1f567252e14 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "cc19e5cf-bf34-4a91-a2d7-519421be8b85" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.378s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:47 compute-0 podman[217775]: 2025-10-01 14:07:47.189616132 +0000 UTC m=+0.095935701 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:07:47 compute-0 podman[217776]: 2025-10-01 14:07:47.251205669 +0000 UTC m=+0.155098623 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 01 14:07:47 compute-0 nova_compute[192698]: 2025-10-01 14:07:47.428 2 DEBUG oslo_concurrency.lockutils [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:47 compute-0 nova_compute[192698]: 2025-10-01 14:07:47.429 2 DEBUG oslo_concurrency.lockutils [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:47 compute-0 nova_compute[192698]: 2025-10-01 14:07:47.430 2 DEBUG oslo_concurrency.lockutils [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:47 compute-0 nova_compute[192698]: 2025-10-01 14:07:47.430 2 DEBUG oslo_concurrency.lockutils [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:47 compute-0 nova_compute[192698]: 2025-10-01 14:07:47.431 2 DEBUG oslo_concurrency.lockutils [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:47 compute-0 nova_compute[192698]: 2025-10-01 14:07:47.451 2 INFO nova.compute.manager [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Terminating instance
Oct 01 14:07:47 compute-0 nova_compute[192698]: 2025-10-01 14:07:47.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:47 compute-0 nova_compute[192698]: 2025-10-01 14:07:47.978 2 DEBUG nova.compute.manager [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:07:47 compute-0 kernel: tap9e1db054-d5 (unregistering): left promiscuous mode
Oct 01 14:07:48 compute-0 NetworkManager[51741]: <info>  [1759327668.0080] device (tap9e1db054-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:07:48 compute-0 ovn_controller[94909]: 2025-10-01T14:07:48Z|00078|binding|INFO|Releasing lport 9e1db054-d550-4384-9fd6-118c2eea0c89 from this chassis (sb_readonly=0)
Oct 01 14:07:48 compute-0 ovn_controller[94909]: 2025-10-01T14:07:48Z|00079|binding|INFO|Setting lport 9e1db054-d550-4384-9fd6-118c2eea0c89 down in Southbound
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:48 compute-0 ovn_controller[94909]: 2025-10-01T14:07:48Z|00080|binding|INFO|Removing iface tap9e1db054-d5 ovn-installed in OVS
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.032 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:d6:d3 10.100.0.8'], port_security=['fa:16:3e:2e:d6:d3 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'ff5702e3-c6c5-4b82-a9c4-6a06747a4cae', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67079b4774294271895bbf7b04f602e7', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'de7872a1-1f76-4b0f-8bd9-119520ff7a88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e3a455d-1f77-441e-b08a-0ec8231910e5, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=9e1db054-d550-4384-9fd6-118c2eea0c89) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.033 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 9e1db054-d550-4384-9fd6-118c2eea0c89 in datapath e35f096a-fd75-4d70-ae58-8a76ae666b9d unbound from our chassis
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.035 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e35f096a-fd75-4d70-ae58-8a76ae666b9d
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.060 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4597babc-a21d-400a-b693-a3dfc457d231]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:48 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Oct 01 14:07:48 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 17.958s CPU time.
Oct 01 14:07:48 compute-0 systemd-machined[152704]: Machine qemu-3-instance-00000007 terminated.
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.098 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[42a1ca3f-f6cf-4ab0-9c7d-4af9769529c3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.101 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a984e24c-fd51-49b1-a203-d7002b8bc705]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.147 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[f45cc157-174b-4f93-95ac-832ec7b173c0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.176 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[27a7c660-115d-49a8-b238-4de36d82dc35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape35f096a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:1b:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 18, 'rx_bytes': 2008, 'tx_bytes': 948, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 18, 'rx_bytes': 2008, 'tx_bytes': 948, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382912, 'reachable_time': 20041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217833, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.204 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c599f1-8954-4500-b3fc-0fa6b9800d92]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382931, 'tstamp': 382931}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217834, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382936, 'tstamp': 382936}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217834, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.207 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape35f096a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.221 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape35f096a-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.222 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.222 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape35f096a-f0, col_values=(('external_ids', {'iface-id': '3f9111f1-79b1-4bf1-bb95-d924c71fb42c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.223 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:48.224 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7d17da47-03d9-468d-9539-f7bf9659df85]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-e35f096a-fd75-4d70-ae58-8a76ae666b9d\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID e35f096a-fd75-4d70-ae58-8a76ae666b9d\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.268 2 INFO nova.virt.libvirt.driver [-] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Instance destroyed successfully.
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.269 2 DEBUG nova.objects.instance [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lazy-loading 'resources' on Instance uuid ff5702e3-c6c5-4b82-a9c4-6a06747a4cae obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.776 2 DEBUG nova.virt.libvirt.vif [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:05:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1626632076',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1626632076',id=7,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:05:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-yli9a99r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteActionsViaActuator-2075848047-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:05:54Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=ff5702e3-c6c5-4b82-a9c4-6a06747a4cae,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9e1db054-d550-4384-9fd6-118c2eea0c89", "address": "fa:16:3e:2e:d6:d3", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e1db054-d5", "ovs_interfaceid": "9e1db054-d550-4384-9fd6-118c2eea0c89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.777 2 DEBUG nova.network.os_vif_util [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converting VIF {"id": "9e1db054-d550-4384-9fd6-118c2eea0c89", "address": "fa:16:3e:2e:d6:d3", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9e1db054-d5", "ovs_interfaceid": "9e1db054-d550-4384-9fd6-118c2eea0c89", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.778 2 DEBUG nova.network.os_vif_util [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:d6:d3,bridge_name='br-int',has_traffic_filtering=True,id=9e1db054-d550-4384-9fd6-118c2eea0c89,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e1db054-d5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.779 2 DEBUG os_vif [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:d6:d3,bridge_name='br-int',has_traffic_filtering=True,id=9e1db054-d550-4384-9fd6-118c2eea0c89,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e1db054-d5') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.782 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9e1db054-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.789 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d03bfbf1-5097-42dd-8e4d-97d3e2000f4e) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.795 2 INFO os_vif [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:d6:d3,bridge_name='br-int',has_traffic_filtering=True,id=9e1db054-d550-4384-9fd6-118c2eea0c89,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9e1db054-d5')
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.796 2 INFO nova.virt.libvirt.driver [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Deleting instance files /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae_del
Oct 01 14:07:48 compute-0 nova_compute[192698]: 2025-10-01 14:07:48.797 2 INFO nova.virt.libvirt.driver [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Deletion of /var/lib/nova/instances/ff5702e3-c6c5-4b82-a9c4-6a06747a4cae_del complete
Oct 01 14:07:49 compute-0 nova_compute[192698]: 2025-10-01 14:07:49.431 2 DEBUG nova.compute.manager [req-8f2c9129-5085-49a1-bbfd-08c2faf10854 req-1ffdc771-8b4e-46cd-b514-ba3a368a3f08 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Received event network-vif-unplugged-9e1db054-d550-4384-9fd6-118c2eea0c89 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:49 compute-0 nova_compute[192698]: 2025-10-01 14:07:49.432 2 DEBUG oslo_concurrency.lockutils [req-8f2c9129-5085-49a1-bbfd-08c2faf10854 req-1ffdc771-8b4e-46cd-b514-ba3a368a3f08 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:49 compute-0 nova_compute[192698]: 2025-10-01 14:07:49.432 2 DEBUG oslo_concurrency.lockutils [req-8f2c9129-5085-49a1-bbfd-08c2faf10854 req-1ffdc771-8b4e-46cd-b514-ba3a368a3f08 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:49 compute-0 nova_compute[192698]: 2025-10-01 14:07:49.433 2 DEBUG oslo_concurrency.lockutils [req-8f2c9129-5085-49a1-bbfd-08c2faf10854 req-1ffdc771-8b4e-46cd-b514-ba3a368a3f08 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:49 compute-0 nova_compute[192698]: 2025-10-01 14:07:49.433 2 DEBUG nova.compute.manager [req-8f2c9129-5085-49a1-bbfd-08c2faf10854 req-1ffdc771-8b4e-46cd-b514-ba3a368a3f08 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] No waiting events found dispatching network-vif-unplugged-9e1db054-d550-4384-9fd6-118c2eea0c89 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:07:49 compute-0 nova_compute[192698]: 2025-10-01 14:07:49.434 2 DEBUG nova.compute.manager [req-8f2c9129-5085-49a1-bbfd-08c2faf10854 req-1ffdc771-8b4e-46cd-b514-ba3a368a3f08 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Received event network-vif-unplugged-9e1db054-d550-4384-9fd6-118c2eea0c89 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:07:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:49.493 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:49 compute-0 nova_compute[192698]: 2025-10-01 14:07:49.869 2 INFO nova.compute.manager [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Took 1.89 seconds to destroy the instance on the hypervisor.
Oct 01 14:07:49 compute-0 nova_compute[192698]: 2025-10-01 14:07:49.870 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:07:49 compute-0 nova_compute[192698]: 2025-10-01 14:07:49.870 2 DEBUG nova.compute.manager [-] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:07:49 compute-0 nova_compute[192698]: 2025-10-01 14:07:49.871 2 DEBUG nova.network.neutron [-] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:07:49 compute-0 nova_compute[192698]: 2025-10-01 14:07:49.871 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:49 compute-0 nova_compute[192698]: 2025-10-01 14:07:49.961 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:51 compute-0 nova_compute[192698]: 2025-10-01 14:07:51.113 2 DEBUG nova.compute.manager [req-6b9c9993-6a52-4138-98b8-ecc36782e7dc req-bc1dc32a-e599-4bad-b658-b4937c1e49b7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Received event network-vif-deleted-9e1db054-d550-4384-9fd6-118c2eea0c89 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:51 compute-0 nova_compute[192698]: 2025-10-01 14:07:51.114 2 INFO nova.compute.manager [req-6b9c9993-6a52-4138-98b8-ecc36782e7dc req-bc1dc32a-e599-4bad-b658-b4937c1e49b7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Neutron deleted interface 9e1db054-d550-4384-9fd6-118c2eea0c89; detaching it from the instance and deleting it from the info cache
Oct 01 14:07:51 compute-0 nova_compute[192698]: 2025-10-01 14:07:51.114 2 DEBUG nova.network.neutron [req-6b9c9993-6a52-4138-98b8-ecc36782e7dc req-bc1dc32a-e599-4bad-b658-b4937c1e49b7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:07:51 compute-0 nova_compute[192698]: 2025-10-01 14:07:51.488 2 DEBUG nova.compute.manager [req-e43f01b6-1ac5-4d4d-acd1-682c62a8439e req-6b83da76-2517-4644-ab2e-aae21a7b062e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Received event network-vif-unplugged-9e1db054-d550-4384-9fd6-118c2eea0c89 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:51 compute-0 nova_compute[192698]: 2025-10-01 14:07:51.488 2 DEBUG oslo_concurrency.lockutils [req-e43f01b6-1ac5-4d4d-acd1-682c62a8439e req-6b83da76-2517-4644-ab2e-aae21a7b062e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:51 compute-0 nova_compute[192698]: 2025-10-01 14:07:51.489 2 DEBUG oslo_concurrency.lockutils [req-e43f01b6-1ac5-4d4d-acd1-682c62a8439e req-6b83da76-2517-4644-ab2e-aae21a7b062e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:51 compute-0 nova_compute[192698]: 2025-10-01 14:07:51.489 2 DEBUG oslo_concurrency.lockutils [req-e43f01b6-1ac5-4d4d-acd1-682c62a8439e req-6b83da76-2517-4644-ab2e-aae21a7b062e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:51 compute-0 nova_compute[192698]: 2025-10-01 14:07:51.490 2 DEBUG nova.compute.manager [req-e43f01b6-1ac5-4d4d-acd1-682c62a8439e req-6b83da76-2517-4644-ab2e-aae21a7b062e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] No waiting events found dispatching network-vif-unplugged-9e1db054-d550-4384-9fd6-118c2eea0c89 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:07:51 compute-0 nova_compute[192698]: 2025-10-01 14:07:51.490 2 DEBUG nova.compute.manager [req-e43f01b6-1ac5-4d4d-acd1-682c62a8439e req-6b83da76-2517-4644-ab2e-aae21a7b062e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Received event network-vif-unplugged-9e1db054-d550-4384-9fd6-118c2eea0c89 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:07:51 compute-0 nova_compute[192698]: 2025-10-01 14:07:51.574 2 DEBUG nova.network.neutron [-] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:07:51 compute-0 nova_compute[192698]: 2025-10-01 14:07:51.624 2 DEBUG nova.compute.manager [req-6b9c9993-6a52-4138-98b8-ecc36782e7dc req-bc1dc32a-e599-4bad-b658-b4937c1e49b7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Detach interface failed, port_id=9e1db054-d550-4384-9fd6-118c2eea0c89, reason: Instance ff5702e3-c6c5-4b82-a9c4-6a06747a4cae could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:07:52 compute-0 nova_compute[192698]: 2025-10-01 14:07:52.079 2 INFO nova.compute.manager [-] [instance: ff5702e3-c6c5-4b82-a9c4-6a06747a4cae] Took 2.21 seconds to deallocate network for instance.
Oct 01 14:07:52 compute-0 nova_compute[192698]: 2025-10-01 14:07:52.604 2 DEBUG oslo_concurrency.lockutils [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:52 compute-0 nova_compute[192698]: 2025-10-01 14:07:52.604 2 DEBUG oslo_concurrency.lockutils [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:52 compute-0 nova_compute[192698]: 2025-10-01 14:07:52.695 2 DEBUG nova.compute.provider_tree [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:07:52 compute-0 nova_compute[192698]: 2025-10-01 14:07:52.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:53 compute-0 nova_compute[192698]: 2025-10-01 14:07:53.203 2 DEBUG nova.scheduler.client.report [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:07:53 compute-0 nova_compute[192698]: 2025-10-01 14:07:53.717 2 DEBUG oslo_concurrency.lockutils [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:53 compute-0 nova_compute[192698]: 2025-10-01 14:07:53.743 2 INFO nova.scheduler.client.report [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Deleted allocations for instance ff5702e3-c6c5-4b82-a9c4-6a06747a4cae
Oct 01 14:07:53 compute-0 nova_compute[192698]: 2025-10-01 14:07:53.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:54 compute-0 nova_compute[192698]: 2025-10-01 14:07:54.784 2 DEBUG oslo_concurrency.lockutils [None req-648fbfa3-c3a2-446f-bea9-7731ec0c7d6f 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "ff5702e3-c6c5-4b82-a9c4-6a06747a4cae" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.355s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:55 compute-0 podman[217853]: 2025-10-01 14:07:55.176514178 +0000 UTC m=+0.090603448 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, 
com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7)
Oct 01 14:07:55 compute-0 nova_compute[192698]: 2025-10-01 14:07:55.638 2 DEBUG oslo_concurrency.lockutils [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "c5470fba-81f4-4592-8b40-1027a4dc1c83" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:55 compute-0 nova_compute[192698]: 2025-10-01 14:07:55.639 2 DEBUG oslo_concurrency.lockutils [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "c5470fba-81f4-4592-8b40-1027a4dc1c83" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:55 compute-0 nova_compute[192698]: 2025-10-01 14:07:55.639 2 DEBUG oslo_concurrency.lockutils [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "c5470fba-81f4-4592-8b40-1027a4dc1c83-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:55 compute-0 nova_compute[192698]: 2025-10-01 14:07:55.639 2 DEBUG oslo_concurrency.lockutils [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "c5470fba-81f4-4592-8b40-1027a4dc1c83-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:55 compute-0 nova_compute[192698]: 2025-10-01 14:07:55.639 2 DEBUG oslo_concurrency.lockutils [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "c5470fba-81f4-4592-8b40-1027a4dc1c83-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:55 compute-0 nova_compute[192698]: 2025-10-01 14:07:55.653 2 INFO nova.compute.manager [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Terminating instance
Oct 01 14:07:56 compute-0 nova_compute[192698]: 2025-10-01 14:07:56.168 2 DEBUG nova.compute.manager [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:07:56 compute-0 kernel: tapbae9bb47-22 (unregistering): left promiscuous mode
Oct 01 14:07:56 compute-0 NetworkManager[51741]: <info>  [1759327676.2133] device (tapbae9bb47-22): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:07:56 compute-0 ovn_controller[94909]: 2025-10-01T14:07:56Z|00081|binding|INFO|Releasing lport bae9bb47-22fa-49ee-9b7e-fc3a13b33880 from this chassis (sb_readonly=0)
Oct 01 14:07:56 compute-0 ovn_controller[94909]: 2025-10-01T14:07:56Z|00082|binding|INFO|Setting lport bae9bb47-22fa-49ee-9b7e-fc3a13b33880 down in Southbound
Oct 01 14:07:56 compute-0 nova_compute[192698]: 2025-10-01 14:07:56.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:56 compute-0 ovn_controller[94909]: 2025-10-01T14:07:56Z|00083|binding|INFO|Removing iface tapbae9bb47-22 ovn-installed in OVS
Oct 01 14:07:56 compute-0 nova_compute[192698]: 2025-10-01 14:07:56.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.237 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:07:29 10.100.0.10'], port_security=['fa:16:3e:1d:07:29 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c5470fba-81f4-4592-8b40-1027a4dc1c83', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67079b4774294271895bbf7b04f602e7', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'de7872a1-1f76-4b0f-8bd9-119520ff7a88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e3a455d-1f77-441e-b08a-0ec8231910e5, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=bae9bb47-22fa-49ee-9b7e-fc3a13b33880) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.241 103791 INFO neutron.agent.ovn.metadata.agent [-] Port bae9bb47-22fa-49ee-9b7e-fc3a13b33880 in datapath e35f096a-fd75-4d70-ae58-8a76ae666b9d unbound from our chassis
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.243 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e35f096a-fd75-4d70-ae58-8a76ae666b9d
Oct 01 14:07:56 compute-0 nova_compute[192698]: 2025-10-01 14:07:56.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.274 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d5679b74-ddb1-4cfe-b922-dc9e1cd5ac72]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:56 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Deactivated successfully.
Oct 01 14:07:56 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Consumed 4.111s CPU time.
Oct 01 14:07:56 compute-0 systemd-machined[152704]: Machine qemu-5-instance-00000006 terminated.
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.323 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8c6fc5-a1ca-4e19-bdf5-3b161bb1e0dd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.327 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ca8670-cdcf-490c-a75f-0830f323e8b7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.374 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[6fe3ce52-154d-4354-9f3f-a6bb8ffa6e7e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.440 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a4a94e42-2250-4b7e-869f-d824a685f1c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape35f096a-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:47:1b:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 20, 'rx_bytes': 2008, 'tx_bytes': 1032, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 20, 'rx_bytes': 2008, 'tx_bytes': 1032, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382912, 'reachable_time': 20041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217886, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:56 compute-0 nova_compute[192698]: 2025-10-01 14:07:56.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.464 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[6de06e1b-f8e5-4ce1-bb0a-eccce06dd09a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382931, 'tstamp': 382931}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217896, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape35f096a-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382936, 'tstamp': 382936}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217896, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.466 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape35f096a-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:56 compute-0 nova_compute[192698]: 2025-10-01 14:07:56.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:56 compute-0 nova_compute[192698]: 2025-10-01 14:07:56.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.475 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape35f096a-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.476 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.476 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape35f096a-f0, col_values=(('external_ids', {'iface-id': '3f9111f1-79b1-4bf1-bb95-d924c71fb42c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.477 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:07:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:07:56.478 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[858acd03-d277-4b01-9129-bcdb79ffbe88]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-e35f096a-fd75-4d70-ae58-8a76ae666b9d\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID e35f096a-fd75-4d70-ae58-8a76ae666b9d\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:07:56 compute-0 nova_compute[192698]: 2025-10-01 14:07:56.495 2 INFO nova.virt.libvirt.driver [-] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Instance destroyed successfully.
Oct 01 14:07:56 compute-0 nova_compute[192698]: 2025-10-01 14:07:56.495 2 DEBUG nova.objects.instance [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lazy-loading 'resources' on Instance uuid c5470fba-81f4-4592-8b40-1027a4dc1c83 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.004 2 DEBUG nova.virt.libvirt.vif [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-01T14:05:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1628220963',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1628220963',id=6,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:05:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-dq548l48',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteActionsViaActuator-2075848047-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:07:19Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=c5470fba-81f4-4592-8b40-1027a4dc1c83,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bae9bb47-22fa-49ee-9b7e-fc3a13b33880", "address": "fa:16:3e:1d:07:29", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae9bb47-22", "ovs_interfaceid": "bae9bb47-22fa-49ee-9b7e-fc3a13b33880", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.005 2 DEBUG nova.network.os_vif_util [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converting VIF {"id": "bae9bb47-22fa-49ee-9b7e-fc3a13b33880", "address": "fa:16:3e:1d:07:29", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbae9bb47-22", "ovs_interfaceid": "bae9bb47-22fa-49ee-9b7e-fc3a13b33880", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.006 2 DEBUG nova.network.os_vif_util [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1d:07:29,bridge_name='br-int',has_traffic_filtering=True,id=bae9bb47-22fa-49ee-9b7e-fc3a13b33880,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae9bb47-22') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.006 2 DEBUG os_vif [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1d:07:29,bridge_name='br-int',has_traffic_filtering=True,id=bae9bb47-22fa-49ee-9b7e-fc3a13b33880,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae9bb47-22') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.009 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbae9bb47-22, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.016 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=477eab30-b842-48e3-b3a3-d66fb7d45b53) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.022 2 INFO os_vif [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1d:07:29,bridge_name='br-int',has_traffic_filtering=True,id=bae9bb47-22fa-49ee-9b7e-fc3a13b33880,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbae9bb47-22')
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.023 2 INFO nova.virt.libvirt.driver [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Deleting instance files /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83_del
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.023 2 INFO nova.virt.libvirt.driver [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Deletion of /var/lib/nova/instances/c5470fba-81f4-4592-8b40-1027a4dc1c83_del complete
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.124 2 DEBUG nova.compute.manager [req-86e32d51-0555-4c5a-9427-3da18a1b5499 req-8dce037b-3c30-42ed-835b-e9f0036bd9f8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Received event network-vif-unplugged-bae9bb47-22fa-49ee-9b7e-fc3a13b33880 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.125 2 DEBUG oslo_concurrency.lockutils [req-86e32d51-0555-4c5a-9427-3da18a1b5499 req-8dce037b-3c30-42ed-835b-e9f0036bd9f8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "c5470fba-81f4-4592-8b40-1027a4dc1c83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.126 2 DEBUG oslo_concurrency.lockutils [req-86e32d51-0555-4c5a-9427-3da18a1b5499 req-8dce037b-3c30-42ed-835b-e9f0036bd9f8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "c5470fba-81f4-4592-8b40-1027a4dc1c83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.126 2 DEBUG oslo_concurrency.lockutils [req-86e32d51-0555-4c5a-9427-3da18a1b5499 req-8dce037b-3c30-42ed-835b-e9f0036bd9f8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "c5470fba-81f4-4592-8b40-1027a4dc1c83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.126 2 DEBUG nova.compute.manager [req-86e32d51-0555-4c5a-9427-3da18a1b5499 req-8dce037b-3c30-42ed-835b-e9f0036bd9f8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] No waiting events found dispatching network-vif-unplugged-bae9bb47-22fa-49ee-9b7e-fc3a13b33880 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.127 2 DEBUG nova.compute.manager [req-86e32d51-0555-4c5a-9427-3da18a1b5499 req-8dce037b-3c30-42ed-835b-e9f0036bd9f8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Received event network-vif-unplugged-bae9bb47-22fa-49ee-9b7e-fc3a13b33880 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.543 2 INFO nova.compute.manager [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Took 1.37 seconds to destroy the instance on the hypervisor.
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.544 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.545 2 DEBUG nova.compute.manager [-] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.545 2 DEBUG nova.network.neutron [-] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.545 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:57 compute-0 nova_compute[192698]: 2025-10-01 14:07:57.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:07:58 compute-0 nova_compute[192698]: 2025-10-01 14:07:58.034 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:07:58 compute-0 nova_compute[192698]: 2025-10-01 14:07:58.367 2 DEBUG nova.compute.manager [req-65f0a6ea-36c3-4a5a-bd12-c113a4663c6e req-6a2dc0f8-8651-48ed-a158-049ed75843b3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Received event network-vif-deleted-bae9bb47-22fa-49ee-9b7e-fc3a13b33880 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:58 compute-0 nova_compute[192698]: 2025-10-01 14:07:58.368 2 INFO nova.compute.manager [req-65f0a6ea-36c3-4a5a-bd12-c113a4663c6e req-6a2dc0f8-8651-48ed-a158-049ed75843b3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Neutron deleted interface bae9bb47-22fa-49ee-9b7e-fc3a13b33880; detaching it from the instance and deleting it from the info cache
Oct 01 14:07:58 compute-0 nova_compute[192698]: 2025-10-01 14:07:58.368 2 DEBUG nova.network.neutron [req-65f0a6ea-36c3-4a5a-bd12-c113a4663c6e req-6a2dc0f8-8651-48ed-a158-049ed75843b3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:07:58 compute-0 nova_compute[192698]: 2025-10-01 14:07:58.818 2 DEBUG nova.network.neutron [-] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:07:58 compute-0 nova_compute[192698]: 2025-10-01 14:07:58.877 2 DEBUG nova.compute.manager [req-65f0a6ea-36c3-4a5a-bd12-c113a4663c6e req-6a2dc0f8-8651-48ed-a158-049ed75843b3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Detach interface failed, port_id=bae9bb47-22fa-49ee-9b7e-fc3a13b33880, reason: Instance c5470fba-81f4-4592-8b40-1027a4dc1c83 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:07:59 compute-0 nova_compute[192698]: 2025-10-01 14:07:59.190 2 DEBUG nova.compute.manager [req-450463fd-32c3-4f48-b57b-841192e8ea37 req-07f4c2d0-707f-41ce-8791-e51ae31f8281 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Received event network-vif-unplugged-bae9bb47-22fa-49ee-9b7e-fc3a13b33880 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:07:59 compute-0 nova_compute[192698]: 2025-10-01 14:07:59.190 2 DEBUG oslo_concurrency.lockutils [req-450463fd-32c3-4f48-b57b-841192e8ea37 req-07f4c2d0-707f-41ce-8791-e51ae31f8281 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "c5470fba-81f4-4592-8b40-1027a4dc1c83-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:59 compute-0 nova_compute[192698]: 2025-10-01 14:07:59.191 2 DEBUG oslo_concurrency.lockutils [req-450463fd-32c3-4f48-b57b-841192e8ea37 req-07f4c2d0-707f-41ce-8791-e51ae31f8281 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "c5470fba-81f4-4592-8b40-1027a4dc1c83-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:59 compute-0 nova_compute[192698]: 2025-10-01 14:07:59.191 2 DEBUG oslo_concurrency.lockutils [req-450463fd-32c3-4f48-b57b-841192e8ea37 req-07f4c2d0-707f-41ce-8791-e51ae31f8281 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "c5470fba-81f4-4592-8b40-1027a4dc1c83-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:59 compute-0 nova_compute[192698]: 2025-10-01 14:07:59.192 2 DEBUG nova.compute.manager [req-450463fd-32c3-4f48-b57b-841192e8ea37 req-07f4c2d0-707f-41ce-8791-e51ae31f8281 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] No waiting events found dispatching network-vif-unplugged-bae9bb47-22fa-49ee-9b7e-fc3a13b33880 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:07:59 compute-0 nova_compute[192698]: 2025-10-01 14:07:59.192 2 DEBUG nova.compute.manager [req-450463fd-32c3-4f48-b57b-841192e8ea37 req-07f4c2d0-707f-41ce-8791-e51ae31f8281 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Received event network-vif-unplugged-bae9bb47-22fa-49ee-9b7e-fc3a13b33880 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:07:59 compute-0 nova_compute[192698]: 2025-10-01 14:07:59.327 2 INFO nova.compute.manager [-] [instance: c5470fba-81f4-4592-8b40-1027a4dc1c83] Took 1.78 seconds to deallocate network for instance.
Oct 01 14:07:59 compute-0 podman[203144]: time="2025-10-01T14:07:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:07:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:07:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:07:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:07:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3477 "" "Go-http-client/1.1"
Oct 01 14:07:59 compute-0 nova_compute[192698]: 2025-10-01 14:07:59.855 2 DEBUG oslo_concurrency.lockutils [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:07:59 compute-0 nova_compute[192698]: 2025-10-01 14:07:59.856 2 DEBUG oslo_concurrency.lockutils [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:07:59 compute-0 nova_compute[192698]: 2025-10-01 14:07:59.861 2 DEBUG oslo_concurrency.lockutils [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:07:59 compute-0 nova_compute[192698]: 2025-10-01 14:07:59.894 2 INFO nova.scheduler.client.report [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Deleted allocations for instance c5470fba-81f4-4592-8b40-1027a4dc1c83
Oct 01 14:08:00 compute-0 nova_compute[192698]: 2025-10-01 14:08:00.948 2 DEBUG oslo_concurrency.lockutils [None req-830edac4-90de-4293-bf97-7f101cb0db31 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "c5470fba-81f4-4592-8b40-1027a4dc1c83" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.309s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:08:01 compute-0 podman[217904]: 2025-10-01 14:08:01.196640861 +0000 UTC m=+0.103256318 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid)
Oct 01 14:08:01 compute-0 podman[217905]: 2025-10-01 14:08:01.208400197 +0000 UTC m=+0.108205251 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 01 14:08:01 compute-0 openstack_network_exporter[205307]: ERROR   14:08:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:08:01 compute-0 openstack_network_exporter[205307]: ERROR   14:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:08:01 compute-0 openstack_network_exporter[205307]: ERROR   14:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:08:01 compute-0 openstack_network_exporter[205307]: ERROR   14:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:08:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:08:01 compute-0 openstack_network_exporter[205307]: ERROR   14:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:08:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:08:02 compute-0 nova_compute[192698]: 2025-10-01 14:08:02.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:02 compute-0 nova_compute[192698]: 2025-10-01 14:08:02.514 2 DEBUG oslo_concurrency.lockutils [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "28407011-1056-4714-96fc-1e8904bbcf1f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:08:02 compute-0 nova_compute[192698]: 2025-10-01 14:08:02.515 2 DEBUG oslo_concurrency.lockutils [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:08:02 compute-0 nova_compute[192698]: 2025-10-01 14:08:02.515 2 DEBUG oslo_concurrency.lockutils [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:08:02 compute-0 nova_compute[192698]: 2025-10-01 14:08:02.516 2 DEBUG oslo_concurrency.lockutils [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:08:02 compute-0 nova_compute[192698]: 2025-10-01 14:08:02.516 2 DEBUG oslo_concurrency.lockutils [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:08:02 compute-0 nova_compute[192698]: 2025-10-01 14:08:02.532 2 INFO nova.compute.manager [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Terminating instance
Oct 01 14:08:02 compute-0 nova_compute[192698]: 2025-10-01 14:08:02.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:02 compute-0 nova_compute[192698]: 2025-10-01 14:08:02.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:08:02 compute-0 nova_compute[192698]: 2025-10-01 14:08:02.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.051 2 DEBUG nova.compute.manager [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:08:03 compute-0 kernel: tap1a9d8f85-cd (unregistering): left promiscuous mode
Oct 01 14:08:03 compute-0 NetworkManager[51741]: <info>  [1759327683.0799] device (tap1a9d8f85-cd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:08:03 compute-0 ovn_controller[94909]: 2025-10-01T14:08:03Z|00084|binding|INFO|Releasing lport 1a9d8f85-cd26-4e65-b316-4dbc35e89aca from this chassis (sb_readonly=0)
Oct 01 14:08:03 compute-0 ovn_controller[94909]: 2025-10-01T14:08:03Z|00085|binding|INFO|Setting lport 1a9d8f85-cd26-4e65-b316-4dbc35e89aca down in Southbound
Oct 01 14:08:03 compute-0 ovn_controller[94909]: 2025-10-01T14:08:03Z|00086|binding|INFO|Removing iface tap1a9d8f85-cd ovn-installed in OVS
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.104 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:68:4f 10.100.0.3'], port_security=['fa:16:3e:17:68:4f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '28407011-1056-4714-96fc-1e8904bbcf1f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67079b4774294271895bbf7b04f602e7', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'de7872a1-1f76-4b0f-8bd9-119520ff7a88', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e3a455d-1f77-441e-b08a-0ec8231910e5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=1a9d8f85-cd26-4e65-b316-4dbc35e89aca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.106 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 1a9d8f85-cd26-4e65-b316-4dbc35e89aca in datapath e35f096a-fd75-4d70-ae58-8a76ae666b9d unbound from our chassis
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.109 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e35f096a-fd75-4d70-ae58-8a76ae666b9d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.112 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[6caa4750-ca14-44df-b5bf-256658b0ae1f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.113 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d namespace which is not needed anymore
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:03 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Deactivated successfully.
Oct 01 14:08:03 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Consumed 23.162s CPU time.
Oct 01 14:08:03 compute-0 systemd-machined[152704]: Machine qemu-2-instance-00000005 terminated.
Oct 01 14:08:03 compute-0 podman[217971]: 2025-10-01 14:08:03.296909414 +0000 UTC m=+0.055110673 container kill 0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930)
Oct 01 14:08:03 compute-0 neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d[216462]: [NOTICE]   (216466) : haproxy version is 3.0.5-8e879a5
Oct 01 14:08:03 compute-0 neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d[216462]: [NOTICE]   (216466) : path to executable is /usr/sbin/haproxy
Oct 01 14:08:03 compute-0 neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d[216462]: [WARNING]  (216466) : Exiting Master process...
Oct 01 14:08:03 compute-0 neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d[216462]: [ALERT]    (216466) : Current worker (216468) exited with code 143 (Terminated)
Oct 01 14:08:03 compute-0 neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d[216462]: [WARNING]  (216466) : All workers exited. Exiting... (0)
Oct 01 14:08:03 compute-0 systemd[1]: libpod-0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360.scope: Deactivated successfully.
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.352 2 INFO nova.virt.libvirt.driver [-] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Instance destroyed successfully.
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.353 2 DEBUG nova.objects.instance [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lazy-loading 'resources' on Instance uuid 28407011-1056-4714-96fc-1e8904bbcf1f obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:08:03 compute-0 podman[218001]: 2025-10-01 14:08:03.381454098 +0000 UTC m=+0.033387859 container died 0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 01 14:08:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360-userdata-shm.mount: Deactivated successfully.
Oct 01 14:08:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-0174d47f382f2361024b4c63aef3c38862dee1d030ae143933ff05834db39753-merged.mount: Deactivated successfully.
Oct 01 14:08:03 compute-0 podman[218001]: 2025-10-01 14:08:03.445066049 +0000 UTC m=+0.096999810 container remove 0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 01 14:08:03 compute-0 systemd[1]: libpod-conmon-0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360.scope: Deactivated successfully.
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.457 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[681d2ebc-713e-431a-8a0a-efb4ed748051]: (4, ("Wed Oct  1 02:08:03 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d (0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360)\n0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360\nWed Oct  1 02:08:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d (0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360)\n0aa86536455b16e34aad8f5f0a422ec17305f86e51f113097ce8e68d15c93360\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.459 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[948ba2cc-5c83-4e3d-94da-6a1974dac9ff]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.459 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e35f096a-fd75-4d70-ae58-8a76ae666b9d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.460 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a97adf-ea44-429a-a9ae-2bfadf48e0e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.461 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape35f096a-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:03 compute-0 kernel: tape35f096a-f0: left promiscuous mode
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.504 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[8fc4ae03-0099-44f6-b24c-a2391d4830ec]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.528 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[213ffaea-3970-4ee9-9b27-60005450f806]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.529 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[67dfa72d-3107-4125-ad40-434a44eb605a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.553 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[6c8269af-4a98-4b4a-8c05-ba192e34173a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382899, 'reachable_time': 23713, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218034, 'error': None, 'target': 'ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:08:03 compute-0 systemd[1]: run-netns-ovnmeta\x2de35f096a\x2dfd75\x2d4d70\x2dae58\x2d8a76ae666b9d.mount: Deactivated successfully.
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.559 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e35f096a-fd75-4d70-ae58-8a76ae666b9d deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:08:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:03.561 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[c74fa507-f208-499e-ae15-4cdb4f6e64e1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.859 2 DEBUG nova.virt.libvirt.vif [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:04:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-757789601',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-757789601',id=5,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:04:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='67079b4774294271895bbf7b04f602e7',ramdisk_id='',reservation_id='r-f6mvgq4u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',
image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-2075848047',owner_user_name='tempest-TestExecuteActionsViaActuator-2075848047-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:04:30Z,user_data=None,user_id='82619989ef1f48a39f1c1e7d64e4cb38',uuid=28407011-1056-4714-96fc-1e8904bbcf1f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "address": "fa:16:3e:17:68:4f", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a9d8f85-cd", "ovs_interfaceid": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.860 2 DEBUG nova.network.os_vif_util [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converting VIF {"id": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "address": "fa:16:3e:17:68:4f", "network": {"id": "e35f096a-fd75-4d70-ae58-8a76ae666b9d", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1299231587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b14b3910fae84828afa468e1e645402b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a9d8f85-cd", "ovs_interfaceid": "1a9d8f85-cd26-4e65-b316-4dbc35e89aca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.861 2 DEBUG nova.network.os_vif_util [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=1a9d8f85-cd26-4e65-b316-4dbc35e89aca,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a9d8f85-cd') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.862 2 DEBUG os_vif [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=1a9d8f85-cd26-4e65-b316-4dbc35e89aca,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a9d8f85-cd') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.865 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a9d8f85-cd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.872 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b3b25e8c-1058-4fb0-9d1b-fca029736e41) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.878 2 INFO os_vif [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:68:4f,bridge_name='br-int',has_traffic_filtering=True,id=1a9d8f85-cd26-4e65-b316-4dbc35e89aca,network=Network(e35f096a-fd75-4d70-ae58-8a76ae666b9d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a9d8f85-cd')
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.879 2 INFO nova.virt.libvirt.driver [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Deleting instance files /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f_del
Oct 01 14:08:03 compute-0 nova_compute[192698]: 2025-10-01 14:08:03.880 2 INFO nova.virt.libvirt.driver [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Deletion of /var/lib/nova/instances/28407011-1056-4714-96fc-1e8904bbcf1f_del complete
Oct 01 14:08:04 compute-0 nova_compute[192698]: 2025-10-01 14:08:04.140 2 DEBUG nova.compute.manager [req-aba24dc3-2d80-4302-92cc-46676f003165 req-bc2ca733-c735-4cf8-9f19-fdc545bcbdb8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Received event network-vif-unplugged-1a9d8f85-cd26-4e65-b316-4dbc35e89aca external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:08:04 compute-0 nova_compute[192698]: 2025-10-01 14:08:04.141 2 DEBUG oslo_concurrency.lockutils [req-aba24dc3-2d80-4302-92cc-46676f003165 req-bc2ca733-c735-4cf8-9f19-fdc545bcbdb8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:08:04 compute-0 nova_compute[192698]: 2025-10-01 14:08:04.141 2 DEBUG oslo_concurrency.lockutils [req-aba24dc3-2d80-4302-92cc-46676f003165 req-bc2ca733-c735-4cf8-9f19-fdc545bcbdb8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:08:04 compute-0 nova_compute[192698]: 2025-10-01 14:08:04.142 2 DEBUG oslo_concurrency.lockutils [req-aba24dc3-2d80-4302-92cc-46676f003165 req-bc2ca733-c735-4cf8-9f19-fdc545bcbdb8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:08:04 compute-0 nova_compute[192698]: 2025-10-01 14:08:04.142 2 DEBUG nova.compute.manager [req-aba24dc3-2d80-4302-92cc-46676f003165 req-bc2ca733-c735-4cf8-9f19-fdc545bcbdb8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] No waiting events found dispatching network-vif-unplugged-1a9d8f85-cd26-4e65-b316-4dbc35e89aca pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:08:04 compute-0 nova_compute[192698]: 2025-10-01 14:08:04.142 2 DEBUG nova.compute.manager [req-aba24dc3-2d80-4302-92cc-46676f003165 req-bc2ca733-c735-4cf8-9f19-fdc545bcbdb8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Received event network-vif-unplugged-1a9d8f85-cd26-4e65-b316-4dbc35e89aca for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:08:04 compute-0 nova_compute[192698]: 2025-10-01 14:08:04.397 2 INFO nova.compute.manager [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Took 1.35 seconds to destroy the instance on the hypervisor.
Oct 01 14:08:04 compute-0 nova_compute[192698]: 2025-10-01 14:08:04.398 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:08:04 compute-0 nova_compute[192698]: 2025-10-01 14:08:04.399 2 DEBUG nova.compute.manager [-] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:08:04 compute-0 nova_compute[192698]: 2025-10-01 14:08:04.399 2 DEBUG nova.network.neutron [-] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:08:04 compute-0 nova_compute[192698]: 2025-10-01 14:08:04.399 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:08:05 compute-0 nova_compute[192698]: 2025-10-01 14:08:05.038 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:08:05 compute-0 nova_compute[192698]: 2025-10-01 14:08:05.499 2 DEBUG nova.compute.manager [req-6e17e35e-e9d9-494f-b5b3-6475db725083 req-153c0855-29af-44a4-a1a9-958f3321ae48 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Received event network-vif-deleted-1a9d8f85-cd26-4e65-b316-4dbc35e89aca external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:08:05 compute-0 nova_compute[192698]: 2025-10-01 14:08:05.500 2 INFO nova.compute.manager [req-6e17e35e-e9d9-494f-b5b3-6475db725083 req-153c0855-29af-44a4-a1a9-958f3321ae48 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Neutron deleted interface 1a9d8f85-cd26-4e65-b316-4dbc35e89aca; detaching it from the instance and deleting it from the info cache
Oct 01 14:08:05 compute-0 nova_compute[192698]: 2025-10-01 14:08:05.501 2 DEBUG nova.network.neutron [req-6e17e35e-e9d9-494f-b5b3-6475db725083 req-153c0855-29af-44a4-a1a9-958f3321ae48 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:08:05 compute-0 nova_compute[192698]: 2025-10-01 14:08:05.949 2 DEBUG nova.network.neutron [-] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:08:06 compute-0 nova_compute[192698]: 2025-10-01 14:08:06.011 2 DEBUG nova.compute.manager [req-6e17e35e-e9d9-494f-b5b3-6475db725083 req-153c0855-29af-44a4-a1a9-958f3321ae48 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Detach interface failed, port_id=1a9d8f85-cd26-4e65-b316-4dbc35e89aca, reason: Instance 28407011-1056-4714-96fc-1e8904bbcf1f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:08:06 compute-0 nova_compute[192698]: 2025-10-01 14:08:06.192 2 DEBUG nova.compute.manager [req-d5ab4a8f-44ee-442f-8413-cb309c2107a4 req-c6e6af44-db50-433b-958e-01a5a5d25010 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Received event network-vif-unplugged-1a9d8f85-cd26-4e65-b316-4dbc35e89aca external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:08:06 compute-0 nova_compute[192698]: 2025-10-01 14:08:06.193 2 DEBUG oslo_concurrency.lockutils [req-d5ab4a8f-44ee-442f-8413-cb309c2107a4 req-c6e6af44-db50-433b-958e-01a5a5d25010 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:08:06 compute-0 nova_compute[192698]: 2025-10-01 14:08:06.193 2 DEBUG oslo_concurrency.lockutils [req-d5ab4a8f-44ee-442f-8413-cb309c2107a4 req-c6e6af44-db50-433b-958e-01a5a5d25010 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:08:06 compute-0 nova_compute[192698]: 2025-10-01 14:08:06.193 2 DEBUG oslo_concurrency.lockutils [req-d5ab4a8f-44ee-442f-8413-cb309c2107a4 req-c6e6af44-db50-433b-958e-01a5a5d25010 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:08:06 compute-0 nova_compute[192698]: 2025-10-01 14:08:06.194 2 DEBUG nova.compute.manager [req-d5ab4a8f-44ee-442f-8413-cb309c2107a4 req-c6e6af44-db50-433b-958e-01a5a5d25010 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] No waiting events found dispatching network-vif-unplugged-1a9d8f85-cd26-4e65-b316-4dbc35e89aca pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:08:06 compute-0 nova_compute[192698]: 2025-10-01 14:08:06.194 2 DEBUG nova.compute.manager [req-d5ab4a8f-44ee-442f-8413-cb309c2107a4 req-c6e6af44-db50-433b-958e-01a5a5d25010 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Received event network-vif-unplugged-1a9d8f85-cd26-4e65-b316-4dbc35e89aca for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:08:06 compute-0 nova_compute[192698]: 2025-10-01 14:08:06.464 2 INFO nova.compute.manager [-] [instance: 28407011-1056-4714-96fc-1e8904bbcf1f] Took 2.06 seconds to deallocate network for instance.
Oct 01 14:08:06 compute-0 nova_compute[192698]: 2025-10-01 14:08:06.990 2 DEBUG oslo_concurrency.lockutils [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:08:06 compute-0 nova_compute[192698]: 2025-10-01 14:08:06.991 2 DEBUG oslo_concurrency.lockutils [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:08:07 compute-0 nova_compute[192698]: 2025-10-01 14:08:07.025 2 DEBUG nova.scheduler.client.report [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Refreshing inventories for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 01 14:08:07 compute-0 nova_compute[192698]: 2025-10-01 14:08:07.042 2 DEBUG nova.scheduler.client.report [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Updating ProviderTree inventory for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 01 14:08:07 compute-0 nova_compute[192698]: 2025-10-01 14:08:07.043 2 DEBUG nova.compute.provider_tree [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 14:08:07 compute-0 nova_compute[192698]: 2025-10-01 14:08:07.062 2 DEBUG nova.scheduler.client.report [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Refreshing aggregate associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 01 14:08:07 compute-0 nova_compute[192698]: 2025-10-01 14:08:07.082 2 DEBUG nova.scheduler.client.report [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Refreshing trait associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, traits: COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 01 14:08:07 compute-0 nova_compute[192698]: 2025-10-01 14:08:07.114 2 DEBUG nova.compute.provider_tree [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:08:07 compute-0 nova_compute[192698]: 2025-10-01 14:08:07.433 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:08:07 compute-0 nova_compute[192698]: 2025-10-01 14:08:07.622 2 DEBUG nova.scheduler.client.report [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:08:07 compute-0 nova_compute[192698]: 2025-10-01 14:08:07.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:08 compute-0 nova_compute[192698]: 2025-10-01 14:08:08.140 2 DEBUG oslo_concurrency.lockutils [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.149s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:08:08 compute-0 podman[218036]: 2025-10-01 14:08:08.175954736 +0000 UTC m=+0.085218903 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:08:08 compute-0 nova_compute[192698]: 2025-10-01 14:08:08.198 2 INFO nova.scheduler.client.report [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Deleted allocations for instance 28407011-1056-4714-96fc-1e8904bbcf1f
Oct 01 14:08:08 compute-0 nova_compute[192698]: 2025-10-01 14:08:08.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:08 compute-0 nova_compute[192698]: 2025-10-01 14:08:08.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:08:08 compute-0 nova_compute[192698]: 2025-10-01 14:08:08.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:08:09 compute-0 nova_compute[192698]: 2025-10-01 14:08:09.240 2 DEBUG oslo_concurrency.lockutils [None req-d36f11e9-7b1f-45a4-bad7-f1076083db4b 82619989ef1f48a39f1c1e7d64e4cb38 67079b4774294271895bbf7b04f602e7 - - default default] Lock "28407011-1056-4714-96fc-1e8904bbcf1f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.725s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:08:09 compute-0 nova_compute[192698]: 2025-10-01 14:08:09.438 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:08:09 compute-0 nova_compute[192698]: 2025-10-01 14:08:09.439 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:08:09 compute-0 nova_compute[192698]: 2025-10-01 14:08:09.439 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:08:09 compute-0 nova_compute[192698]: 2025-10-01 14:08:09.440 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:08:09 compute-0 nova_compute[192698]: 2025-10-01 14:08:09.707 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:08:09 compute-0 nova_compute[192698]: 2025-10-01 14:08:09.710 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:08:09 compute-0 nova_compute[192698]: 2025-10-01 14:08:09.755 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:08:09 compute-0 nova_compute[192698]: 2025-10-01 14:08:09.757 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5826MB free_disk=73.30662155151367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:08:09 compute-0 nova_compute[192698]: 2025-10-01 14:08:09.758 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:08:09 compute-0 nova_compute[192698]: 2025-10-01 14:08:09.758 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:08:10 compute-0 unix_chkpwd[218067]: password check failed for user (root)
Oct 01 14:08:10 compute-0 sshd-session[218065]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Oct 01 14:08:10 compute-0 nova_compute[192698]: 2025-10-01 14:08:10.800 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:08:10 compute-0 nova_compute[192698]: 2025-10-01 14:08:10.802 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:08:09 up  1:07,  0 user,  load average: 0.54, 0.44, 0.51\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:08:10 compute-0 nova_compute[192698]: 2025-10-01 14:08:10.827 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:08:11 compute-0 nova_compute[192698]: 2025-10-01 14:08:11.336 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:08:11 compute-0 nova_compute[192698]: 2025-10-01 14:08:11.850 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:08:11 compute-0 nova_compute[192698]: 2025-10-01 14:08:11.851 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:08:12 compute-0 sshd-session[218065]: Failed password for root from 91.224.92.28 port 34554 ssh2
Oct 01 14:08:12 compute-0 nova_compute[192698]: 2025-10-01 14:08:12.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:13 compute-0 unix_chkpwd[218068]: password check failed for user (root)
Oct 01 14:08:13 compute-0 nova_compute[192698]: 2025-10-01 14:08:13.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:14.242 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:08:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:14.242 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:08:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:14.243 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:08:14 compute-0 nova_compute[192698]: 2025-10-01 14:08:14.851 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:08:14 compute-0 nova_compute[192698]: 2025-10-01 14:08:14.852 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:08:14 compute-0 nova_compute[192698]: 2025-10-01 14:08:14.852 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:08:14 compute-0 nova_compute[192698]: 2025-10-01 14:08:14.852 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:08:14 compute-0 nova_compute[192698]: 2025-10-01 14:08:14.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:08:14 compute-0 nova_compute[192698]: 2025-10-01 14:08:14.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:08:15 compute-0 sshd-session[218065]: Failed password for root from 91.224.92.28 port 34554 ssh2
Oct 01 14:08:15 compute-0 nova_compute[192698]: 2025-10-01 14:08:15.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:08:16 compute-0 unix_chkpwd[218070]: password check failed for user (root)
Oct 01 14:08:17 compute-0 sshd-session[218065]: Failed password for root from 91.224.92.28 port 34554 ssh2
Oct 01 14:08:17 compute-0 nova_compute[192698]: 2025-10-01 14:08:17.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:18 compute-0 podman[218071]: 2025-10-01 14:08:18.222876521 +0000 UTC m=+0.117424429 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4)
Oct 01 14:08:18 compute-0 podman[218072]: 2025-10-01 14:08:18.242233481 +0000 UTC m=+0.129719829 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Oct 01 14:08:18 compute-0 nova_compute[192698]: 2025-10-01 14:08:18.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:18 compute-0 sshd-session[218065]: Received disconnect from 91.224.92.28 port 34554:11:  [preauth]
Oct 01 14:08:18 compute-0 sshd-session[218065]: Disconnected from authenticating user root 91.224.92.28 port 34554 [preauth]
Oct 01 14:08:18 compute-0 sshd-session[218065]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Oct 01 14:08:19 compute-0 unix_chkpwd[218116]: password check failed for user (root)
Oct 01 14:08:19 compute-0 sshd-session[218114]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Oct 01 14:08:21 compute-0 sshd-session[218114]: Failed password for root from 91.224.92.28 port 24294 ssh2
Oct 01 14:08:22 compute-0 nova_compute[192698]: 2025-10-01 14:08:22.433 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:08:22 compute-0 nova_compute[192698]: 2025-10-01 14:08:22.434 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 01 14:08:22 compute-0 unix_chkpwd[218117]: password check failed for user (root)
Oct 01 14:08:22 compute-0 nova_compute[192698]: 2025-10-01 14:08:22.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:22 compute-0 nova_compute[192698]: 2025-10-01 14:08:22.974 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 01 14:08:23 compute-0 nova_compute[192698]: 2025-10-01 14:08:23.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:24 compute-0 sshd-session[218114]: Failed password for root from 91.224.92.28 port 24294 ssh2
Oct 01 14:08:25 compute-0 unix_chkpwd[218118]: password check failed for user (root)
Oct 01 14:08:26 compute-0 podman[218119]: 2025-10-01 14:08:26.193694245 +0000 UTC m=+0.102726674 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, io.openshift.expose-services=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1755695350)
Oct 01 14:08:26 compute-0 nova_compute[192698]: 2025-10-01 14:08:26.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:27 compute-0 sshd-session[218114]: Failed password for root from 91.224.92.28 port 24294 ssh2
Oct 01 14:08:27 compute-0 nova_compute[192698]: 2025-10-01 14:08:27.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:28 compute-0 sshd-session[218114]: Received disconnect from 91.224.92.28 port 24294:11:  [preauth]
Oct 01 14:08:28 compute-0 sshd-session[218114]: Disconnected from authenticating user root 91.224.92.28 port 24294 [preauth]
Oct 01 14:08:28 compute-0 sshd-session[218114]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Oct 01 14:08:28 compute-0 nova_compute[192698]: 2025-10-01 14:08:28.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:29 compute-0 unix_chkpwd[218144]: password check failed for user (root)
Oct 01 14:08:29 compute-0 sshd-session[218142]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Oct 01 14:08:29 compute-0 podman[203144]: time="2025-10-01T14:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:08:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:08:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3011 "" "Go-http-client/1.1"
Oct 01 14:08:30 compute-0 sshd-session[218142]: Failed password for root from 91.224.92.28 port 14200 ssh2
Oct 01 14:08:31 compute-0 openstack_network_exporter[205307]: ERROR   14:08:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:08:31 compute-0 openstack_network_exporter[205307]: ERROR   14:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:08:31 compute-0 openstack_network_exporter[205307]: ERROR   14:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:08:31 compute-0 openstack_network_exporter[205307]: ERROR   14:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:08:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:08:31 compute-0 openstack_network_exporter[205307]: ERROR   14:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:08:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:08:31 compute-0 unix_chkpwd[218145]: password check failed for user (root)
Oct 01 14:08:32 compute-0 podman[218146]: 2025-10-01 14:08:32.186717253 +0000 UTC m=+0.093931477 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 01 14:08:32 compute-0 podman[218147]: 2025-10-01 14:08:32.196189728 +0000 UTC m=+0.097779871 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4)
Oct 01 14:08:32 compute-0 nova_compute[192698]: 2025-10-01 14:08:32.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:33 compute-0 sshd-session[218142]: Failed password for root from 91.224.92.28 port 14200 ssh2
Oct 01 14:08:33 compute-0 nova_compute[192698]: 2025-10-01 14:08:33.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:34 compute-0 unix_chkpwd[218183]: password check failed for user (root)
Oct 01 14:08:36 compute-0 sshd-session[218142]: Failed password for root from 91.224.92.28 port 14200 ssh2
Oct 01 14:08:37 compute-0 sshd-session[218142]: Received disconnect from 91.224.92.28 port 14200:11:  [preauth]
Oct 01 14:08:37 compute-0 sshd-session[218142]: Disconnected from authenticating user root 91.224.92.28 port 14200 [preauth]
Oct 01 14:08:37 compute-0 sshd-session[218142]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Oct 01 14:08:37 compute-0 nova_compute[192698]: 2025-10-01 14:08:37.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:38.938 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:12:cf 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-becc357a-665d-42a0-9440-5383962ecf85', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-becc357a-665d-42a0-9440-5383962ecf85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '066bb0cdf38a41b786fd15af0a2c834e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=122eb5c6-8eb5-4891-90ef-718c58d07d03, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=c1b3cbbb-bd0b-4d63-8ed5-cbe5f193869a) old=Port_Binding(mac=['fa:16:3e:f4:12:cf'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-becc357a-665d-42a0-9440-5383962ecf85', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-becc357a-665d-42a0-9440-5383962ecf85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '066bb0cdf38a41b786fd15af0a2c834e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:08:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:38.939 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port c1b3cbbb-bd0b-4d63-8ed5-cbe5f193869a in datapath becc357a-665d-42a0-9440-5383962ecf85 updated
Oct 01 14:08:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:38.940 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network becc357a-665d-42a0-9440-5383962ecf85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:08:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:38.942 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[2a440ce2-52dd-4f73-a978-1af630c128b3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:08:39 compute-0 nova_compute[192698]: 2025-10-01 14:08:39.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:39 compute-0 podman[218184]: 2025-10-01 14:08:39.173820797 +0000 UTC m=+0.083734133 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:08:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:41.568 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:08:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:41.569 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:08:41 compute-0 nova_compute[192698]: 2025-10-01 14:08:41.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:42 compute-0 nova_compute[192698]: 2025-10-01 14:08:42.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:44 compute-0 nova_compute[192698]: 2025-10-01 14:08:44.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:44 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:44.571 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:08:47 compute-0 nova_compute[192698]: 2025-10-01 14:08:47.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:48.394 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:c0:db 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-38752ddf-09c3-4ec2-8695-bc44d239b96b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38752ddf-09c3-4ec2-8695-bc44d239b96b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6bf51d775b7c4c15a0326680d214c2bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=450469f4-e458-4c38-a33f-ed323d6392f1, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5336ffd9-ba5c-4b21-9536-403550e1266f) old=Port_Binding(mac=['fa:16:3e:02:c0:db'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-38752ddf-09c3-4ec2-8695-bc44d239b96b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38752ddf-09c3-4ec2-8695-bc44d239b96b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6bf51d775b7c4c15a0326680d214c2bd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:08:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:48.396 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5336ffd9-ba5c-4b21-9536-403550e1266f in datapath 38752ddf-09c3-4ec2-8695-bc44d239b96b updated
Oct 01 14:08:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:48.397 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 38752ddf-09c3-4ec2-8695-bc44d239b96b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:08:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:08:48.398 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[957a99e5-b3ab-4a7f-9e00-ac29c64bfb34]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:08:49 compute-0 nova_compute[192698]: 2025-10-01 14:08:49.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:49 compute-0 podman[218209]: 2025-10-01 14:08:49.178183946 +0000 UTC m=+0.090851745 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 01 14:08:49 compute-0 podman[218210]: 2025-10-01 14:08:49.257679884 +0000 UTC m=+0.134146459 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:08:52 compute-0 nova_compute[192698]: 2025-10-01 14:08:52.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:54 compute-0 nova_compute[192698]: 2025-10-01 14:08:54.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:57 compute-0 podman[218252]: 2025-10-01 14:08:57.177372662 +0000 UTC m=+0.087045422 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, release=1755695350)
Oct 01 14:08:57 compute-0 nova_compute[192698]: 2025-10-01 14:08:57.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:59 compute-0 nova_compute[192698]: 2025-10-01 14:08:59.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:08:59 compute-0 podman[203144]: time="2025-10-01T14:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:08:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:08:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Oct 01 14:09:00 compute-0 ovn_controller[94909]: 2025-10-01T14:09:00Z|00087|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Oct 01 14:09:01 compute-0 openstack_network_exporter[205307]: ERROR   14:09:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:09:01 compute-0 openstack_network_exporter[205307]: ERROR   14:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:09:01 compute-0 openstack_network_exporter[205307]: ERROR   14:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:09:01 compute-0 openstack_network_exporter[205307]: ERROR   14:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:09:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:09:01 compute-0 openstack_network_exporter[205307]: ERROR   14:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:09:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:09:03 compute-0 nova_compute[192698]: 2025-10-01 14:09:03.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:03 compute-0 podman[218276]: 2025-10-01 14:09:03.184627902 +0000 UTC m=+0.091199284 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 01 14:09:03 compute-0 podman[218277]: 2025-10-01 14:09:03.1901299 +0000 UTC m=+0.095577802 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 01 14:09:04 compute-0 nova_compute[192698]: 2025-10-01 14:09:04.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:08 compute-0 nova_compute[192698]: 2025-10-01 14:09:08.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:08 compute-0 nova_compute[192698]: 2025-10-01 14:09:08.467 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:09:08 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 01 14:09:08 compute-0 nova_compute[192698]: 2025-10-01 14:09:08.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:09:09 compute-0 nova_compute[192698]: 2025-10-01 14:09:09.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:09 compute-0 nova_compute[192698]: 2025-10-01 14:09:09.517 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:09:09 compute-0 nova_compute[192698]: 2025-10-01 14:09:09.518 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:09:09 compute-0 nova_compute[192698]: 2025-10-01 14:09:09.519 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:09:09 compute-0 nova_compute[192698]: 2025-10-01 14:09:09.519 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:09:09 compute-0 nova_compute[192698]: 2025-10-01 14:09:09.751 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:09:09 compute-0 nova_compute[192698]: 2025-10-01 14:09:09.753 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:09:09 compute-0 nova_compute[192698]: 2025-10-01 14:09:09.792 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:09:09 compute-0 nova_compute[192698]: 2025-10-01 14:09:09.794 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5836MB free_disk=73.3066291809082GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:09:09 compute-0 nova_compute[192698]: 2025-10-01 14:09:09.794 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:09:09 compute-0 nova_compute[192698]: 2025-10-01 14:09:09.794 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:09:10 compute-0 podman[218316]: 2025-10-01 14:09:10.189236095 +0000 UTC m=+0.099015214 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:09:10 compute-0 nova_compute[192698]: 2025-10-01 14:09:10.923 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:09:10 compute-0 nova_compute[192698]: 2025-10-01 14:09:10.924 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:09:09 up  1:08,  0 user,  load average: 0.23, 0.37, 0.48\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:09:10 compute-0 nova_compute[192698]: 2025-10-01 14:09:10.971 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:09:11 compute-0 nova_compute[192698]: 2025-10-01 14:09:11.483 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:09:11 compute-0 nova_compute[192698]: 2025-10-01 14:09:11.995 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:09:11 compute-0 nova_compute[192698]: 2025-10-01 14:09:11.995 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.201s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:09:13 compute-0 nova_compute[192698]: 2025-10-01 14:09:13.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:13 compute-0 nova_compute[192698]: 2025-10-01 14:09:13.996 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:09:13 compute-0 nova_compute[192698]: 2025-10-01 14:09:13.997 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:09:13 compute-0 nova_compute[192698]: 2025-10-01 14:09:13.998 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:09:13 compute-0 nova_compute[192698]: 2025-10-01 14:09:13.999 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:09:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:14.244 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:09:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:14.245 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:09:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:14.245 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:09:14 compute-0 nova_compute[192698]: 2025-10-01 14:09:14.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:14 compute-0 nova_compute[192698]: 2025-10-01 14:09:14.916 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:09:15 compute-0 nova_compute[192698]: 2025-10-01 14:09:15.454 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:09:16 compute-0 nova_compute[192698]: 2025-10-01 14:09:16.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:09:16 compute-0 nova_compute[192698]: 2025-10-01 14:09:16.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:09:18 compute-0 nova_compute[192698]: 2025-10-01 14:09:18.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:19 compute-0 nova_compute[192698]: 2025-10-01 14:09:19.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:20 compute-0 podman[218342]: 2025-10-01 14:09:20.163906687 +0000 UTC m=+0.068153224 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 01 14:09:20 compute-0 podman[218343]: 2025-10-01 14:09:20.230792147 +0000 UTC m=+0.130914613 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 01 14:09:21 compute-0 nova_compute[192698]: 2025-10-01 14:09:21.731 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:09:21 compute-0 nova_compute[192698]: 2025-10-01 14:09:21.732 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:09:22 compute-0 nova_compute[192698]: 2025-10-01 14:09:22.237 2 DEBUG nova.compute.manager [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 01 14:09:22 compute-0 nova_compute[192698]: 2025-10-01 14:09:22.792 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:09:22 compute-0 nova_compute[192698]: 2025-10-01 14:09:22.793 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:09:22 compute-0 nova_compute[192698]: 2025-10-01 14:09:22.800 2 DEBUG nova.virt.hardware [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:09:22 compute-0 nova_compute[192698]: 2025-10-01 14:09:22.801 2 INFO nova.compute.claims [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:09:23 compute-0 nova_compute[192698]: 2025-10-01 14:09:23.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:23 compute-0 nova_compute[192698]: 2025-10-01 14:09:23.908 2 DEBUG nova.compute.provider_tree [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:09:24 compute-0 nova_compute[192698]: 2025-10-01 14:09:24.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:24 compute-0 nova_compute[192698]: 2025-10-01 14:09:24.418 2 DEBUG nova.scheduler.client.report [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:09:24 compute-0 nova_compute[192698]: 2025-10-01 14:09:24.930 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.137s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:09:24 compute-0 nova_compute[192698]: 2025-10-01 14:09:24.932 2 DEBUG nova.compute.manager [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 01 14:09:25 compute-0 nova_compute[192698]: 2025-10-01 14:09:25.444 2 DEBUG nova.compute.manager [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 01 14:09:25 compute-0 nova_compute[192698]: 2025-10-01 14:09:25.445 2 DEBUG nova.network.neutron [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 01 14:09:25 compute-0 nova_compute[192698]: 2025-10-01 14:09:25.446 2 WARNING neutronclient.v2_0.client [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:09:25 compute-0 nova_compute[192698]: 2025-10-01 14:09:25.446 2 WARNING neutronclient.v2_0.client [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:09:25 compute-0 nova_compute[192698]: 2025-10-01 14:09:25.955 2 INFO nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 01 14:09:25 compute-0 nova_compute[192698]: 2025-10-01 14:09:25.983 2 DEBUG nova.network.neutron [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Successfully created port: c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 01 14:09:26 compute-0 nova_compute[192698]: 2025-10-01 14:09:26.465 2 DEBUG nova.compute.manager [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.492 2 DEBUG nova.compute.manager [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.494 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.494 2 INFO nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Creating image(s)
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.495 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Acquiring lock "/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.495 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Lock "/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.496 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Lock "/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.496 2 DEBUG oslo_utils.imageutils.format_inspector [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.499 2 DEBUG oslo_utils.imageutils.format_inspector [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.505 2 DEBUG oslo_concurrency.processutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.597 2 DEBUG oslo_concurrency.processutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.599 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.599 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.600 2 DEBUG oslo_utils.imageutils.format_inspector [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.605 2 DEBUG oslo_utils.imageutils.format_inspector [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.606 2 DEBUG oslo_concurrency.processutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.707 2 DEBUG oslo_concurrency.processutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.709 2 DEBUG oslo_concurrency.processutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.756 2 DEBUG oslo_concurrency.processutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.757 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.158s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.758 2 DEBUG oslo_concurrency.processutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.834 2 DEBUG oslo_concurrency.processutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.835 2 DEBUG nova.virt.disk.api [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Checking if we can resize image /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.835 2 DEBUG oslo_concurrency.processutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.918 2 DEBUG oslo_concurrency.processutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.920 2 DEBUG nova.virt.disk.api [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Cannot resize image /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.920 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.921 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Ensure instance console log exists: /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.922 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.922 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:09:27 compute-0 nova_compute[192698]: 2025-10-01 14:09:27.923 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:09:28 compute-0 nova_compute[192698]: 2025-10-01 14:09:28.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:28 compute-0 podman[218403]: 2025-10-01 14:09:28.169555047 +0000 UTC m=+0.085396058 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Oct 01 14:09:28 compute-0 nova_compute[192698]: 2025-10-01 14:09:28.198 2 DEBUG nova.network.neutron [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Successfully updated port: c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 01 14:09:28 compute-0 nova_compute[192698]: 2025-10-01 14:09:28.252 2 DEBUG nova.compute.manager [req-84572853-fbbe-45cf-bd23-d4b2cea20939 req-e697efbf-b7fb-4539-94f4-1d5967edd603 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-changed-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:09:28 compute-0 nova_compute[192698]: 2025-10-01 14:09:28.252 2 DEBUG nova.compute.manager [req-84572853-fbbe-45cf-bd23-d4b2cea20939 req-e697efbf-b7fb-4539-94f4-1d5967edd603 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Refreshing instance network info cache due to event network-changed-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:09:28 compute-0 nova_compute[192698]: 2025-10-01 14:09:28.253 2 DEBUG oslo_concurrency.lockutils [req-84572853-fbbe-45cf-bd23-d4b2cea20939 req-e697efbf-b7fb-4539-94f4-1d5967edd603 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-ff58d640-84a3-4709-9a4a-084f3deaac0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:09:28 compute-0 nova_compute[192698]: 2025-10-01 14:09:28.253 2 DEBUG oslo_concurrency.lockutils [req-84572853-fbbe-45cf-bd23-d4b2cea20939 req-e697efbf-b7fb-4539-94f4-1d5967edd603 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-ff58d640-84a3-4709-9a4a-084f3deaac0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:09:28 compute-0 nova_compute[192698]: 2025-10-01 14:09:28.254 2 DEBUG nova.network.neutron [req-84572853-fbbe-45cf-bd23-d4b2cea20939 req-e697efbf-b7fb-4539-94f4-1d5967edd603 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Refreshing network info cache for port c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:09:28 compute-0 nova_compute[192698]: 2025-10-01 14:09:28.707 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Acquiring lock "refresh_cache-ff58d640-84a3-4709-9a4a-084f3deaac0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:09:28 compute-0 nova_compute[192698]: 2025-10-01 14:09:28.763 2 WARNING neutronclient.v2_0.client [req-84572853-fbbe-45cf-bd23-d4b2cea20939 req-e697efbf-b7fb-4539-94f4-1d5967edd603 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:09:29 compute-0 nova_compute[192698]: 2025-10-01 14:09:29.092 2 DEBUG nova.network.neutron [req-84572853-fbbe-45cf-bd23-d4b2cea20939 req-e697efbf-b7fb-4539-94f4-1d5967edd603 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:09:29 compute-0 nova_compute[192698]: 2025-10-01 14:09:29.283 2 DEBUG nova.network.neutron [req-84572853-fbbe-45cf-bd23-d4b2cea20939 req-e697efbf-b7fb-4539-94f4-1d5967edd603 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:09:29 compute-0 nova_compute[192698]: 2025-10-01 14:09:29.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:29 compute-0 podman[203144]: time="2025-10-01T14:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:09:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:09:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Oct 01 14:09:29 compute-0 nova_compute[192698]: 2025-10-01 14:09:29.790 2 DEBUG oslo_concurrency.lockutils [req-84572853-fbbe-45cf-bd23-d4b2cea20939 req-e697efbf-b7fb-4539-94f4-1d5967edd603 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-ff58d640-84a3-4709-9a4a-084f3deaac0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:09:29 compute-0 nova_compute[192698]: 2025-10-01 14:09:29.791 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Acquired lock "refresh_cache-ff58d640-84a3-4709-9a4a-084f3deaac0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:09:29 compute-0 nova_compute[192698]: 2025-10-01 14:09:29.792 2 DEBUG nova.network.neutron [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:09:31 compute-0 nova_compute[192698]: 2025-10-01 14:09:31.075 2 DEBUG nova.network.neutron [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:09:31 compute-0 nova_compute[192698]: 2025-10-01 14:09:31.326 2 WARNING neutronclient.v2_0.client [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:09:31 compute-0 openstack_network_exporter[205307]: ERROR   14:09:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:09:31 compute-0 openstack_network_exporter[205307]: ERROR   14:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:09:31 compute-0 openstack_network_exporter[205307]: ERROR   14:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:09:31 compute-0 openstack_network_exporter[205307]: ERROR   14:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:09:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:09:31 compute-0 openstack_network_exporter[205307]: ERROR   14:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:09:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.198 2 DEBUG nova.network.neutron [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Updating instance_info_cache with network_info: [{"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.707 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Releasing lock "refresh_cache-ff58d640-84a3-4709-9a4a-084f3deaac0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.708 2 DEBUG nova.compute.manager [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Instance network_info: |[{"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.712 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Start _get_guest_xml network_info=[{"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.718 2 WARNING nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.720 2 DEBUG nova.virt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-1583961141', uuid='ff58d640-84a3-4709-9a4a-084f3deaac0c'), owner=OwnerMeta(userid='fc564881007a4754ade24ed65141e269', username='tempest-TestExecuteBasicStrategy-716451052-project-admin', projectid='6bf51d775b7c4c15a0326680d214c2bd', projectname='tempest-TestExecuteBasicStrategy-716451052'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759327772.7205982) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.725 2 DEBUG nova.virt.libvirt.host [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.726 2 DEBUG nova.virt.libvirt.host [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.730 2 DEBUG nova.virt.libvirt.host [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.731 2 DEBUG nova.virt.libvirt.host [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.731 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.732 2 DEBUG nova.virt.hardware [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.732 2 DEBUG nova.virt.hardware [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.733 2 DEBUG nova.virt.hardware [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.733 2 DEBUG nova.virt.hardware [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.733 2 DEBUG nova.virt.hardware [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.734 2 DEBUG nova.virt.hardware [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.734 2 DEBUG nova.virt.hardware [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.735 2 DEBUG nova.virt.hardware [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.735 2 DEBUG nova.virt.hardware [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.735 2 DEBUG nova.virt.hardware [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.736 2 DEBUG nova.virt.hardware [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.742 2 DEBUG nova.virt.libvirt.vif [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:09:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1583961141',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1583961141',id=11,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6bf51d775b7c4c15a0326680d214c2bd',ramdisk_id='',reservation_id='r-xzdsyszl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-716451052',owner_user_name='tempest-TestExecuteBasicStrategy-716451052-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:09:26Z,user_data=None,user_id='fc564881007a4754ade24ed65141e269',uuid=ff58d640-84a3-4709-9a4a-084f3deaac0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.742 2 DEBUG nova.network.os_vif_util [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Converting VIF {"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.744 2 DEBUG nova.network.os_vif_util [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4,network=Network(becc357a-665d-42a0-9440-5383962ecf85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f94db4-b8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:09:32 compute-0 nova_compute[192698]: 2025-10-01 14:09:32.745 2 DEBUG nova.objects.instance [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Lazy-loading 'pci_devices' on Instance uuid ff58d640-84a3-4709-9a4a-084f3deaac0c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.253 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:09:33 compute-0 nova_compute[192698]:   <uuid>ff58d640-84a3-4709-9a4a-084f3deaac0c</uuid>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   <name>instance-0000000b</name>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1583961141</nova:name>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:09:32</nova:creationTime>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:09:33 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:09:33 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:user uuid="fc564881007a4754ade24ed65141e269">tempest-TestExecuteBasicStrategy-716451052-project-admin</nova:user>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:project uuid="6bf51d775b7c4c15a0326680d214c2bd">tempest-TestExecuteBasicStrategy-716451052</nova:project>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         <nova:port uuid="c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4">
Oct 01 14:09:33 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <system>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <entry name="serial">ff58d640-84a3-4709-9a4a-084f3deaac0c</entry>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <entry name="uuid">ff58d640-84a3-4709-9a4a-084f3deaac0c</entry>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     </system>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   <os>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   </os>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   <features>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   </features>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk.config"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:47:d6:1f"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <target dev="tapc8f94db4-b8"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/console.log" append="off"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <video>
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     </video>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:09:33 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:09:33 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:09:33 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:09:33 compute-0 nova_compute[192698]: </domain>
Oct 01 14:09:33 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.254 2 DEBUG nova.compute.manager [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Preparing to wait for external event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.255 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.255 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.256 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.257 2 DEBUG nova.virt.libvirt.vif [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:09:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1583961141',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1583961141',id=11,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6bf51d775b7c4c15a0326680d214c2bd',ramdisk_id='',reservation_id='r-xzdsyszl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-716451052',owner_user_name='tempest-TestExecuteBasicStrategy-716451052-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:09:26Z,user_data=None,user_id='fc564881007a4754ade24ed65141e269',uuid=ff58d640-84a3-4709-9a4a-084f3deaac0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.257 2 DEBUG nova.network.os_vif_util [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Converting VIF {"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.258 2 DEBUG nova.network.os_vif_util [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4,network=Network(becc357a-665d-42a0-9440-5383962ecf85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f94db4-b8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.259 2 DEBUG os_vif [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4,network=Network(becc357a-665d-42a0-9440-5383962ecf85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f94db4-b8') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.260 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.261 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.262 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ecc5cd00-b787-5e23-86e1-f9120e08f66f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.272 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc8f94db4-b8, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.273 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapc8f94db4-b8, col_values=(('qos', UUID('c68abe7a-4b30-417b-aea5-3258cbe9594f')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.273 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapc8f94db4-b8, col_values=(('external_ids', {'iface-id': 'c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:d6:1f', 'vm-uuid': 'ff58d640-84a3-4709-9a4a-084f3deaac0c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:33 compute-0 NetworkManager[51741]: <info>  [1759327773.2763] manager: (tapc8f94db4-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:33 compute-0 nova_compute[192698]: 2025-10-01 14:09:33.285 2 INFO os_vif [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4,network=Network(becc357a-665d-42a0-9440-5383962ecf85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f94db4-b8')
Oct 01 14:09:34 compute-0 podman[218427]: 2025-10-01 14:09:34.155473787 +0000 UTC m=+0.070076226 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:09:34 compute-0 podman[218428]: 2025-10-01 14:09:34.187852859 +0000 UTC m=+0.086941121 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 01 14:09:34 compute-0 nova_compute[192698]: 2025-10-01 14:09:34.831 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:09:34 compute-0 nova_compute[192698]: 2025-10-01 14:09:34.831 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:09:34 compute-0 nova_compute[192698]: 2025-10-01 14:09:34.831 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] No VIF found with MAC fa:16:3e:47:d6:1f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:09:34 compute-0 nova_compute[192698]: 2025-10-01 14:09:34.832 2 INFO nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Using config drive
Oct 01 14:09:35 compute-0 nova_compute[192698]: 2025-10-01 14:09:35.345 2 WARNING neutronclient.v2_0.client [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:09:36 compute-0 nova_compute[192698]: 2025-10-01 14:09:36.031 2 INFO nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Creating config drive at /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk.config
Oct 01 14:09:36 compute-0 nova_compute[192698]: 2025-10-01 14:09:36.042 2 DEBUG oslo_concurrency.processutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpyq3ryp9k execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:09:36 compute-0 nova_compute[192698]: 2025-10-01 14:09:36.200 2 DEBUG oslo_concurrency.processutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpyq3ryp9k" returned: 0 in 0.159s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:09:36 compute-0 kernel: tapc8f94db4-b8: entered promiscuous mode
Oct 01 14:09:36 compute-0 NetworkManager[51741]: <info>  [1759327776.2949] manager: (tapc8f94db4-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Oct 01 14:09:36 compute-0 ovn_controller[94909]: 2025-10-01T14:09:36Z|00088|binding|INFO|Claiming lport c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 for this chassis.
Oct 01 14:09:36 compute-0 nova_compute[192698]: 2025-10-01 14:09:36.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:36 compute-0 ovn_controller[94909]: 2025-10-01T14:09:36Z|00089|binding|INFO|c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4: Claiming fa:16:3e:47:d6:1f 10.100.0.8
Oct 01 14:09:36 compute-0 nova_compute[192698]: 2025-10-01 14:09:36.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:36 compute-0 systemd-udevd[218481]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.341 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:d6:1f 10.100.0.8'], port_security=['fa:16:3e:47:d6:1f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'ff58d640-84a3-4709-9a4a-084f3deaac0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-becc357a-665d-42a0-9440-5383962ecf85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6bf51d775b7c4c15a0326680d214c2bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'bb26a45c-b473-4596-a233-6d94ca53d7db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=122eb5c6-8eb5-4891-90ef-718c58d07d03, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.343 103791 INFO neutron.agent.ovn.metadata.agent [-] Port c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 in datapath becc357a-665d-42a0-9440-5383962ecf85 bound to our chassis
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.344 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network becc357a-665d-42a0-9440-5383962ecf85
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.358 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e9b7fb99-707b-4a30-8dfe-16b0a3fbe0f3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.359 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbecc357a-61 in ovnmeta-becc357a-665d-42a0-9440-5383962ecf85 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:09:36 compute-0 NetworkManager[51741]: <info>  [1759327776.3607] device (tapc8f94db4-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.361 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbecc357a-60 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.361 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[154130f7-c73d-4ad5-ac53-f81b5c4a2eee]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.363 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e1136952-a2c6-4e94-82b1-cfb6dd6c2fee]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 NetworkManager[51741]: <info>  [1759327776.3686] device (tapc8f94db4-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:09:36 compute-0 systemd-machined[152704]: New machine qemu-7-instance-0000000b.
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.380 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[35432320-d35f-4779-8471-e6416c83ed14]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.408 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c496d7db-c04d-42aa-92a5-44991783d96b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_controller[94909]: 2025-10-01T14:09:36Z|00090|binding|INFO|Setting lport c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 ovn-installed in OVS
Oct 01 14:09:36 compute-0 ovn_controller[94909]: 2025-10-01T14:09:36Z|00091|binding|INFO|Setting lport c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 up in Southbound
Oct 01 14:09:36 compute-0 nova_compute[192698]: 2025-10-01 14:09:36.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:36 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-0000000b.
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.463 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5ef609-7dee-464a-824a-6d1f682bd9d8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.471 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[84534cd9-a85e-40f2-ae18-31fad1428c47]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 NetworkManager[51741]: <info>  [1759327776.4734] manager: (tapbecc357a-60): new Veth device (/org/freedesktop/NetworkManager/Devices/40)
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.518 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a133b9d8-aee9-42b1-bd8b-b3fcc60aa602]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.524 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[31ddc4d0-a475-447e-aee2-12c7abb4ecc6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 NetworkManager[51741]: <info>  [1759327776.5590] device (tapbecc357a-60): carrier: link connected
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.566 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[7cf5c58e-4179-4513-b171-9e330c2c92d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.586 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[84c10403-d936-4bf9-a5b0-83e57a19cd72]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbecc357a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:12:cf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413709, 'reachable_time': 29146, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218515, 'error': None, 'target': 'ovnmeta-becc357a-665d-42a0-9440-5383962ecf85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.606 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[916603b7-4343-4a8f-8868-ed48e3954089]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef4:12cf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413709, 'tstamp': 413709}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218516, 'error': None, 'target': 'ovnmeta-becc357a-665d-42a0-9440-5383962ecf85', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.626 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[af0a6c2e-7f4f-4bc1-83a0-5192949a0998]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbecc357a-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:12:cf'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413709, 'reachable_time': 29146, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218517, 'error': None, 'target': 'ovnmeta-becc357a-665d-42a0-9440-5383962ecf85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.674 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4e41fa2c-462b-4ff3-b86b-4d7a88e7df5c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.775 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[36ce1b9c-ffc1-4569-a246-f4b31eebd03c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.778 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbecc357a-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.778 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.779 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbecc357a-60, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:09:36 compute-0 nova_compute[192698]: 2025-10-01 14:09:36.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:36 compute-0 kernel: tapbecc357a-60: entered promiscuous mode
Oct 01 14:09:36 compute-0 NetworkManager[51741]: <info>  [1759327776.7838] manager: (tapbecc357a-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Oct 01 14:09:36 compute-0 nova_compute[192698]: 2025-10-01 14:09:36.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.787 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbecc357a-60, col_values=(('external_ids', {'iface-id': 'c1b3cbbb-bd0b-4d63-8ed5-cbe5f193869a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:09:36 compute-0 nova_compute[192698]: 2025-10-01 14:09:36.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:36 compute-0 ovn_controller[94909]: 2025-10-01T14:09:36Z|00092|binding|INFO|Releasing lport c1b3cbbb-bd0b-4d63-8ed5-cbe5f193869a from this chassis (sb_readonly=0)
Oct 01 14:09:36 compute-0 nova_compute[192698]: 2025-10-01 14:09:36.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.815 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7095acc8-846d-4a3e-a2e1-7c5874843f7c]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.817 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/becc357a-665d-42a0-9440-5383962ecf85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/becc357a-665d-42a0-9440-5383962ecf85.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.817 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/becc357a-665d-42a0-9440-5383962ecf85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/becc357a-665d-42a0-9440-5383962ecf85.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.817 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for becc357a-665d-42a0-9440-5383962ecf85 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.818 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/becc357a-665d-42a0-9440-5383962ecf85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/becc357a-665d-42a0-9440-5383962ecf85.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.818 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e41bb49b-05bc-4f1c-8cd0-8491c69c0704]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.819 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/becc357a-665d-42a0-9440-5383962ecf85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/becc357a-665d-42a0-9440-5383962ecf85.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.819 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a763dddb-7a77-4532-ad78-c4e58a6b4c39]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.820 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-becc357a-665d-42a0-9440-5383962ecf85
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/becc357a-665d-42a0-9440-5383962ecf85.pid.haproxy
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID becc357a-665d-42a0-9440-5383962ecf85
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:09:36 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:09:36.821 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-becc357a-665d-42a0-9440-5383962ecf85', 'env', 'PROCESS_TAG=haproxy-becc357a-665d-42a0-9440-5383962ecf85', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/becc357a-665d-42a0-9440-5383962ecf85.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:09:37 compute-0 nova_compute[192698]: 2025-10-01 14:09:37.166 2 DEBUG nova.compute.manager [req-a1c25b65-a78d-4fad-84fb-f0d790bc1294 req-a7426a46-de60-4ec5-b559-2dbd58adcbac 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:09:37 compute-0 nova_compute[192698]: 2025-10-01 14:09:37.167 2 DEBUG oslo_concurrency.lockutils [req-a1c25b65-a78d-4fad-84fb-f0d790bc1294 req-a7426a46-de60-4ec5-b559-2dbd58adcbac 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:09:37 compute-0 nova_compute[192698]: 2025-10-01 14:09:37.167 2 DEBUG oslo_concurrency.lockutils [req-a1c25b65-a78d-4fad-84fb-f0d790bc1294 req-a7426a46-de60-4ec5-b559-2dbd58adcbac 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:09:37 compute-0 nova_compute[192698]: 2025-10-01 14:09:37.167 2 DEBUG oslo_concurrency.lockutils [req-a1c25b65-a78d-4fad-84fb-f0d790bc1294 req-a7426a46-de60-4ec5-b559-2dbd58adcbac 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:09:37 compute-0 nova_compute[192698]: 2025-10-01 14:09:37.167 2 DEBUG nova.compute.manager [req-a1c25b65-a78d-4fad-84fb-f0d790bc1294 req-a7426a46-de60-4ec5-b559-2dbd58adcbac 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Processing event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:09:37 compute-0 podman[218556]: 2025-10-01 14:09:37.297215807 +0000 UTC m=+0.066146491 container create d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 01 14:09:37 compute-0 systemd[1]: Started libpod-conmon-d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff.scope.
Oct 01 14:09:37 compute-0 podman[218556]: 2025-10-01 14:09:37.262541954 +0000 UTC m=+0.031472638 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:09:37 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:09:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1683e9ed7d9ef4a00f6de6000b1a8a8179acacd83f6d26b9201be1b2885f0a80/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:09:37 compute-0 podman[218556]: 2025-10-01 14:09:37.401891663 +0000 UTC m=+0.170822387 container init d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Oct 01 14:09:37 compute-0 podman[218556]: 2025-10-01 14:09:37.409275462 +0000 UTC m=+0.178206146 container start d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Oct 01 14:09:37 compute-0 neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85[218572]: [NOTICE]   (218576) : New worker (218578) forked
Oct 01 14:09:37 compute-0 neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85[218572]: [NOTICE]   (218576) : Loading success.
Oct 01 14:09:37 compute-0 nova_compute[192698]: 2025-10-01 14:09:37.470 2 DEBUG nova.compute.manager [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:09:37 compute-0 nova_compute[192698]: 2025-10-01 14:09:37.475 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 01 14:09:37 compute-0 nova_compute[192698]: 2025-10-01 14:09:37.480 2 INFO nova.virt.libvirt.driver [-] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Instance spawned successfully.
Oct 01 14:09:37 compute-0 nova_compute[192698]: 2025-10-01 14:09:37.481 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 01 14:09:37 compute-0 nova_compute[192698]: 2025-10-01 14:09:37.998 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:09:37 compute-0 nova_compute[192698]: 2025-10-01 14:09:37.999 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:09:38 compute-0 nova_compute[192698]: 2025-10-01 14:09:38.000 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:09:38 compute-0 nova_compute[192698]: 2025-10-01 14:09:38.001 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:09:38 compute-0 nova_compute[192698]: 2025-10-01 14:09:38.002 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:09:38 compute-0 nova_compute[192698]: 2025-10-01 14:09:38.003 2 DEBUG nova.virt.libvirt.driver [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:09:38 compute-0 nova_compute[192698]: 2025-10-01 14:09:38.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:38 compute-0 nova_compute[192698]: 2025-10-01 14:09:38.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:38 compute-0 nova_compute[192698]: 2025-10-01 14:09:38.514 2 INFO nova.compute.manager [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Took 11.02 seconds to spawn the instance on the hypervisor.
Oct 01 14:09:38 compute-0 nova_compute[192698]: 2025-10-01 14:09:38.515 2 DEBUG nova.compute.manager [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:09:39 compute-0 nova_compute[192698]: 2025-10-01 14:09:39.057 2 INFO nova.compute.manager [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Took 16.31 seconds to build instance.
Oct 01 14:09:39 compute-0 nova_compute[192698]: 2025-10-01 14:09:39.224 2 DEBUG nova.compute.manager [req-c44bde73-a18b-459d-815f-a464bbfc4a52 req-7567a215-cebb-46c1-ab1b-387e7c29ff5a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:09:39 compute-0 nova_compute[192698]: 2025-10-01 14:09:39.225 2 DEBUG oslo_concurrency.lockutils [req-c44bde73-a18b-459d-815f-a464bbfc4a52 req-7567a215-cebb-46c1-ab1b-387e7c29ff5a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:09:39 compute-0 nova_compute[192698]: 2025-10-01 14:09:39.225 2 DEBUG oslo_concurrency.lockutils [req-c44bde73-a18b-459d-815f-a464bbfc4a52 req-7567a215-cebb-46c1-ab1b-387e7c29ff5a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:09:39 compute-0 nova_compute[192698]: 2025-10-01 14:09:39.226 2 DEBUG oslo_concurrency.lockutils [req-c44bde73-a18b-459d-815f-a464bbfc4a52 req-7567a215-cebb-46c1-ab1b-387e7c29ff5a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:09:39 compute-0 nova_compute[192698]: 2025-10-01 14:09:39.226 2 DEBUG nova.compute.manager [req-c44bde73-a18b-459d-815f-a464bbfc4a52 req-7567a215-cebb-46c1-ab1b-387e7c29ff5a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] No waiting events found dispatching network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:09:39 compute-0 nova_compute[192698]: 2025-10-01 14:09:39.227 2 WARNING nova.compute.manager [req-c44bde73-a18b-459d-815f-a464bbfc4a52 req-7567a215-cebb-46c1-ab1b-387e7c29ff5a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received unexpected event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 for instance with vm_state active and task_state None.
Oct 01 14:09:39 compute-0 nova_compute[192698]: 2025-10-01 14:09:39.568 2 DEBUG oslo_concurrency.lockutils [None req-326ae1da-581a-4652-9fda-f0aada22be69 fc564881007a4754ade24ed65141e269 6bf51d775b7c4c15a0326680d214c2bd - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.837s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:09:41 compute-0 podman[218587]: 2025-10-01 14:09:41.186649545 +0000 UTC m=+0.098595614 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:09:43 compute-0 nova_compute[192698]: 2025-10-01 14:09:43.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:43 compute-0 nova_compute[192698]: 2025-10-01 14:09:43.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:48 compute-0 nova_compute[192698]: 2025-10-01 14:09:48.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:48 compute-0 nova_compute[192698]: 2025-10-01 14:09:48.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:50 compute-0 ovn_controller[94909]: 2025-10-01T14:09:50Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:47:d6:1f 10.100.0.8
Oct 01 14:09:50 compute-0 ovn_controller[94909]: 2025-10-01T14:09:50Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:47:d6:1f 10.100.0.8
Oct 01 14:09:51 compute-0 podman[218625]: 2025-10-01 14:09:51.156915725 +0000 UTC m=+0.067818136 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 01 14:09:51 compute-0 podman[218626]: 2025-10-01 14:09:51.227505973 +0000 UTC m=+0.136644747 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct 01 14:09:53 compute-0 nova_compute[192698]: 2025-10-01 14:09:53.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:53 compute-0 nova_compute[192698]: 2025-10-01 14:09:53.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:58 compute-0 nova_compute[192698]: 2025-10-01 14:09:58.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:58 compute-0 nova_compute[192698]: 2025-10-01 14:09:58.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:09:59 compute-0 podman[218672]: 2025-10-01 14:09:59.173711102 +0000 UTC m=+0.085165163 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, distribution-scope=public, architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 01 14:09:59 compute-0 podman[203144]: time="2025-10-01T14:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:09:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:09:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3483 "" "Go-http-client/1.1"
Oct 01 14:10:01 compute-0 openstack_network_exporter[205307]: ERROR   14:10:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:10:01 compute-0 openstack_network_exporter[205307]: ERROR   14:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:10:01 compute-0 openstack_network_exporter[205307]: ERROR   14:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:10:01 compute-0 openstack_network_exporter[205307]: ERROR   14:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:10:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:10:01 compute-0 openstack_network_exporter[205307]: ERROR   14:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:10:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:10:02 compute-0 nova_compute[192698]: 2025-10-01 14:10:02.075 2 DEBUG nova.compute.manager [None req-c4bce767-5713-4bed-9851-4bcba90881e8 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Oct 01 14:10:02 compute-0 nova_compute[192698]: 2025-10-01 14:10:02.120 2 DEBUG nova.compute.provider_tree [None req-c4bce767-5713-4bed-9851-4bcba90881e8 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Updating resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 generation from 9 to 11 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 01 14:10:03 compute-0 nova_compute[192698]: 2025-10-01 14:10:03.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:03 compute-0 nova_compute[192698]: 2025-10-01 14:10:03.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:05 compute-0 podman[218694]: 2025-10-01 14:10:05.160829325 +0000 UTC m=+0.076116179 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Oct 01 14:10:05 compute-0 podman[218695]: 2025-10-01 14:10:05.184627985 +0000 UTC m=+0.090049714 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 01 14:10:06 compute-0 ovn_controller[94909]: 2025-10-01T14:10:06Z|00093|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct 01 14:10:07 compute-0 nova_compute[192698]: 2025-10-01 14:10:07.927 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:10:08 compute-0 nova_compute[192698]: 2025-10-01 14:10:08.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:08 compute-0 nova_compute[192698]: 2025-10-01 14:10:08.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:09 compute-0 nova_compute[192698]: 2025-10-01 14:10:09.755 2 DEBUG nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Check if temp file /var/lib/nova/instances/tmpa2navyg5 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Oct 01 14:10:09 compute-0 nova_compute[192698]: 2025-10-01 14:10:09.762 2 DEBUG nova.compute.manager [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpa2navyg5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ff58d640-84a3-4709-9a4a-084f3deaac0c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Oct 01 14:10:10 compute-0 nova_compute[192698]: 2025-10-01 14:10:10.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:10:11 compute-0 nova_compute[192698]: 2025-10-01 14:10:11.445 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:11 compute-0 nova_compute[192698]: 2025-10-01 14:10:11.446 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:11 compute-0 nova_compute[192698]: 2025-10-01 14:10:11.446 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:11 compute-0 nova_compute[192698]: 2025-10-01 14:10:11.446 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:10:11 compute-0 podman[218735]: 2025-10-01 14:10:11.58728713 +0000 UTC m=+0.085969594 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:10:12 compute-0 nova_compute[192698]: 2025-10-01 14:10:12.528 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:10:12 compute-0 nova_compute[192698]: 2025-10-01 14:10:12.620 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:10:12 compute-0 nova_compute[192698]: 2025-10-01 14:10:12.622 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:10:12 compute-0 nova_compute[192698]: 2025-10-01 14:10:12.719 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:10:12 compute-0 nova_compute[192698]: 2025-10-01 14:10:12.968 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:10:12 compute-0 nova_compute[192698]: 2025-10-01 14:10:12.970 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:10:13 compute-0 nova_compute[192698]: 2025-10-01 14:10:13.004 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:10:13 compute-0 nova_compute[192698]: 2025-10-01 14:10:13.006 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5658MB free_disk=73.27649688720703GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:10:13 compute-0 nova_compute[192698]: 2025-10-01 14:10:13.007 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:13 compute-0 nova_compute[192698]: 2025-10-01 14:10:13.007 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:13 compute-0 nova_compute[192698]: 2025-10-01 14:10:13.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:13 compute-0 nova_compute[192698]: 2025-10-01 14:10:13.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.028 2 INFO nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Updating resource usage from migration 95d4b164-ecb5-485b-b5eb-37e5147e6ed1
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.064 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Migration 95d4b164-ecb5-485b-b5eb-37e5147e6ed1 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.065 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.065 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:10:13 up  1:09,  0 user,  load average: 0.18, 0.33, 0.45\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_6bf51d775b7c4c15a0326680d214c2bd': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.108 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:10:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:14.246 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:14.247 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:14.248 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.300 2 DEBUG oslo_concurrency.processutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.365 2 DEBUG oslo_concurrency.processutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.366 2 DEBUG oslo_concurrency.processutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.462 2 DEBUG oslo_concurrency.processutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.464 2 DEBUG nova.compute.manager [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Preparing to wait for external event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.464 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.465 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.465 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:14 compute-0 nova_compute[192698]: 2025-10-01 14:10:14.616 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:10:15 compute-0 nova_compute[192698]: 2025-10-01 14:10:15.130 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:10:15 compute-0 nova_compute[192698]: 2025-10-01 14:10:15.130 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.123s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:16 compute-0 nova_compute[192698]: 2025-10-01 14:10:16.131 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:10:16 compute-0 nova_compute[192698]: 2025-10-01 14:10:16.132 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:10:16 compute-0 nova_compute[192698]: 2025-10-01 14:10:16.132 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:10:16 compute-0 nova_compute[192698]: 2025-10-01 14:10:16.132 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:10:16 compute-0 nova_compute[192698]: 2025-10-01 14:10:16.132 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:10:18 compute-0 nova_compute[192698]: 2025-10-01 14:10:18.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:18 compute-0 nova_compute[192698]: 2025-10-01 14:10:18.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:18 compute-0 nova_compute[192698]: 2025-10-01 14:10:18.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:10:18 compute-0 nova_compute[192698]: 2025-10-01 14:10:18.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:10:20 compute-0 nova_compute[192698]: 2025-10-01 14:10:20.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:20 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:20.428 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:10:20 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:20.430 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:10:20 compute-0 nova_compute[192698]: 2025-10-01 14:10:20.434 2 DEBUG nova.compute.manager [req-0785f9d4-5991-45ad-9e6d-00c699df696a req-7251f661-2003-4a85-9387-68ddf7342a19 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-unplugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:10:20 compute-0 nova_compute[192698]: 2025-10-01 14:10:20.434 2 DEBUG oslo_concurrency.lockutils [req-0785f9d4-5991-45ad-9e6d-00c699df696a req-7251f661-2003-4a85-9387-68ddf7342a19 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:20 compute-0 nova_compute[192698]: 2025-10-01 14:10:20.434 2 DEBUG oslo_concurrency.lockutils [req-0785f9d4-5991-45ad-9e6d-00c699df696a req-7251f661-2003-4a85-9387-68ddf7342a19 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:20 compute-0 nova_compute[192698]: 2025-10-01 14:10:20.434 2 DEBUG oslo_concurrency.lockutils [req-0785f9d4-5991-45ad-9e6d-00c699df696a req-7251f661-2003-4a85-9387-68ddf7342a19 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:20 compute-0 nova_compute[192698]: 2025-10-01 14:10:20.435 2 DEBUG nova.compute.manager [req-0785f9d4-5991-45ad-9e6d-00c699df696a req-7251f661-2003-4a85-9387-68ddf7342a19 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] No event matching network-vif-unplugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 in dict_keys([('network-vif-plugged', 'c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Oct 01 14:10:20 compute-0 nova_compute[192698]: 2025-10-01 14:10:20.435 2 DEBUG nova.compute.manager [req-0785f9d4-5991-45ad-9e6d-00c699df696a req-7251f661-2003-4a85-9387-68ddf7342a19 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-unplugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:10:21 compute-0 nova_compute[192698]: 2025-10-01 14:10:21.492 2 INFO nova.compute.manager [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Took 7.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Oct 01 14:10:22 compute-0 podman[218774]: 2025-10-01 14:10:22.190268019 +0000 UTC m=+0.088084481 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 01 14:10:22 compute-0 podman[218775]: 2025-10-01 14:10:22.2144416 +0000 UTC m=+0.110871255 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.4)
Oct 01 14:10:22 compute-0 nova_compute[192698]: 2025-10-01 14:10:22.507 2 DEBUG nova.compute.manager [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:10:22 compute-0 nova_compute[192698]: 2025-10-01 14:10:22.508 2 DEBUG oslo_concurrency.lockutils [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:22 compute-0 nova_compute[192698]: 2025-10-01 14:10:22.508 2 DEBUG oslo_concurrency.lockutils [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:22 compute-0 nova_compute[192698]: 2025-10-01 14:10:22.508 2 DEBUG oslo_concurrency.lockutils [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:22 compute-0 nova_compute[192698]: 2025-10-01 14:10:22.508 2 DEBUG nova.compute.manager [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Processing event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:10:22 compute-0 nova_compute[192698]: 2025-10-01 14:10:22.508 2 DEBUG nova.compute.manager [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-changed-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:10:22 compute-0 nova_compute[192698]: 2025-10-01 14:10:22.508 2 DEBUG nova.compute.manager [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Refreshing instance network info cache due to event network-changed-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:10:22 compute-0 nova_compute[192698]: 2025-10-01 14:10:22.509 2 DEBUG oslo_concurrency.lockutils [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-ff58d640-84a3-4709-9a4a-084f3deaac0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:10:22 compute-0 nova_compute[192698]: 2025-10-01 14:10:22.509 2 DEBUG oslo_concurrency.lockutils [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-ff58d640-84a3-4709-9a4a-084f3deaac0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:10:22 compute-0 nova_compute[192698]: 2025-10-01 14:10:22.509 2 DEBUG nova.network.neutron [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Refreshing network info cache for port c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:10:22 compute-0 nova_compute[192698]: 2025-10-01 14:10:22.510 2 DEBUG nova.compute.manager [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:10:23 compute-0 nova_compute[192698]: 2025-10-01 14:10:23.016 2 WARNING neutronclient.v2_0.client [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:10:23 compute-0 nova_compute[192698]: 2025-10-01 14:10:23.021 2 DEBUG nova.compute.manager [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpa2navyg5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ff58d640-84a3-4709-9a4a-084f3deaac0c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(95d4b164-ecb5-485b-b5eb-37e5147e6ed1),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Oct 01 14:10:23 compute-0 nova_compute[192698]: 2025-10-01 14:10:23.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:23 compute-0 nova_compute[192698]: 2025-10-01 14:10:23.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:23 compute-0 nova_compute[192698]: 2025-10-01 14:10:23.539 2 DEBUG nova.objects.instance [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid ff58d640-84a3-4709-9a4a-084f3deaac0c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:10:23 compute-0 nova_compute[192698]: 2025-10-01 14:10:23.540 2 DEBUG nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Oct 01 14:10:23 compute-0 nova_compute[192698]: 2025-10-01 14:10:23.542 2 DEBUG nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Oct 01 14:10:23 compute-0 nova_compute[192698]: 2025-10-01 14:10:23.542 2 DEBUG nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.045 2 DEBUG nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.046 2 DEBUG nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.054 2 DEBUG nova.virt.libvirt.vif [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:09:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1583961141',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1583961141',id=11,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:09:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6bf51d775b7c4c15a0326680d214c2bd',ramdisk_id='',reservation_id='r-xzdsyszl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-716451052',owner_user_name='tempest-TestExecuteBasicStrategy-716451052-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:09:38Z,user_data=None,user_id='fc564881007a4754ade24ed65141e269',uuid=ff58d640-84a3-4709-9a4a-084f3deaac0c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.054 2 DEBUG nova.network.os_vif_util [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.056 2 DEBUG nova.network.os_vif_util [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4,network=Network(becc357a-665d-42a0-9440-5383962ecf85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f94db4-b8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.057 2 DEBUG nova.virt.libvirt.migration [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Updating guest XML with vif config: <interface type="ethernet">
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <mac address="fa:16:3e:47:d6:1f"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <model type="virtio"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <mtu size="1442"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <target dev="tapc8f94db4-b8"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]: </interface>
Oct 01 14:10:24 compute-0 nova_compute[192698]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.058 2 DEBUG nova.virt.libvirt.migration [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <name>instance-0000000b</name>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <uuid>ff58d640-84a3-4709-9a4a-084f3deaac0c</uuid>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1583961141</nova:name>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:09:32</nova:creationTime>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:10:24 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:10:24 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:user uuid="fc564881007a4754ade24ed65141e269">tempest-TestExecuteBasicStrategy-716451052-project-admin</nova:user>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:project uuid="6bf51d775b7c4c15a0326680d214c2bd">tempest-TestExecuteBasicStrategy-716451052</nova:project>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:port uuid="c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4">
Oct 01 14:10:24 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <memory unit="KiB">131072</memory>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <vcpu placement="static">1</vcpu>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <resource>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <partition>/machine</partition>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </resource>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <system>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="serial">ff58d640-84a3-4709-9a4a-084f3deaac0c</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="uuid">ff58d640-84a3-4709-9a4a-084f3deaac0c</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </system>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <os>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </os>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <features>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <vmcoreinfo state="on"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </features>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <cpu mode="host-model" check="partial">
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <on_poweroff>destroy</on_poweroff>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <on_reboot>restart</on_reboot>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <on_crash>destroy</on_crash>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk.config"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <readonly/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="1" port="0x10"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="2" port="0x11"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="3" port="0x12"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="4" port="0x13"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="5" port="0x14"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="6" port="0x15"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="7" port="0x16"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="8" port="0x17"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="9" port="0x18"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="10" port="0x19"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="11" port="0x1a"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="12" port="0x1b"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="13" port="0x1c"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="14" port="0x1d"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="15" port="0x1e"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="16" port="0x1f"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="17" port="0x20"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="18" port="0x21"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="19" port="0x22"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="20" port="0x23"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="21" port="0x24"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="22" port="0x25"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="23" port="0x26"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="24" port="0x27"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="25" port="0x28"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-pci-bridge"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="sata" index="0">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:47:d6:1f"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target dev="tapc8f94db4-b8"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/console.log" append="off"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target type="isa-serial" port="0">
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <model name="isa-serial"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </target>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <console type="pty">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/console.log" append="off"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target type="serial" port="0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </console>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="usb" bus="0" port="1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </input>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <input type="mouse" bus="ps2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <listen type="address" address="::"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </graphics>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <video>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model type="virtio" heads="1" primary="yes"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </video>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]: </domain>
Oct 01 14:10:24 compute-0 nova_compute[192698]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.059 2 DEBUG nova.virt.libvirt.migration [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <name>instance-0000000b</name>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <uuid>ff58d640-84a3-4709-9a4a-084f3deaac0c</uuid>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1583961141</nova:name>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:09:32</nova:creationTime>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:10:24 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:10:24 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:user uuid="fc564881007a4754ade24ed65141e269">tempest-TestExecuteBasicStrategy-716451052-project-admin</nova:user>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:project uuid="6bf51d775b7c4c15a0326680d214c2bd">tempest-TestExecuteBasicStrategy-716451052</nova:project>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:port uuid="c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4">
Oct 01 14:10:24 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <memory unit="KiB">131072</memory>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <vcpu placement="static">1</vcpu>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <resource>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <partition>/machine</partition>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </resource>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <system>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="serial">ff58d640-84a3-4709-9a4a-084f3deaac0c</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="uuid">ff58d640-84a3-4709-9a4a-084f3deaac0c</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </system>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <os>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </os>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <features>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <vmcoreinfo state="on"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </features>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <cpu mode="host-model" check="partial">
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <on_poweroff>destroy</on_poweroff>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <on_reboot>restart</on_reboot>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <on_crash>destroy</on_crash>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk.config"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <readonly/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="1" port="0x10"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="2" port="0x11"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="3" port="0x12"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="4" port="0x13"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="5" port="0x14"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="6" port="0x15"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="7" port="0x16"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="8" port="0x17"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="9" port="0x18"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="10" port="0x19"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="11" port="0x1a"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="12" port="0x1b"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="13" port="0x1c"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="14" port="0x1d"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="15" port="0x1e"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="16" port="0x1f"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="17" port="0x20"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="18" port="0x21"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="19" port="0x22"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="20" port="0x23"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="21" port="0x24"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="22" port="0x25"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="23" port="0x26"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="24" port="0x27"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="25" port="0x28"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-pci-bridge"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="sata" index="0">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:47:d6:1f"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target dev="tapc8f94db4-b8"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/console.log" append="off"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target type="isa-serial" port="0">
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <model name="isa-serial"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </target>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <console type="pty">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/console.log" append="off"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target type="serial" port="0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </console>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="usb" bus="0" port="1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </input>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <input type="mouse" bus="ps2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <listen type="address" address="::"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </graphics>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <video>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model type="virtio" heads="1" primary="yes"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </video>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]: </domain>
Oct 01 14:10:24 compute-0 nova_compute[192698]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.059 2 DEBUG nova.virt.libvirt.migration [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] _update_pci_xml output xml=<domain type="kvm">
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <name>instance-0000000b</name>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <uuid>ff58d640-84a3-4709-9a4a-084f3deaac0c</uuid>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteBasicStrategy-server-1583961141</nova:name>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:09:32</nova:creationTime>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:10:24 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:10:24 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:user uuid="fc564881007a4754ade24ed65141e269">tempest-TestExecuteBasicStrategy-716451052-project-admin</nova:user>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:project uuid="6bf51d775b7c4c15a0326680d214c2bd">tempest-TestExecuteBasicStrategy-716451052</nova:project>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <nova:port uuid="c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4">
Oct 01 14:10:24 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <memory unit="KiB">131072</memory>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <vcpu placement="static">1</vcpu>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <resource>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <partition>/machine</partition>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </resource>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <system>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="serial">ff58d640-84a3-4709-9a4a-084f3deaac0c</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="uuid">ff58d640-84a3-4709-9a4a-084f3deaac0c</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </system>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <os>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </os>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <features>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <vmcoreinfo state="on"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </features>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <cpu mode="host-model" check="partial">
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <on_poweroff>destroy</on_poweroff>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <on_reboot>restart</on_reboot>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <on_crash>destroy</on_crash>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/disk.config"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <readonly/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="1" port="0x10"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="2" port="0x11"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="3" port="0x12"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="4" port="0x13"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="5" port="0x14"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="6" port="0x15"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="7" port="0x16"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="8" port="0x17"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="9" port="0x18"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="10" port="0x19"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="11" port="0x1a"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="12" port="0x1b"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="13" port="0x1c"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="14" port="0x1d"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="15" port="0x1e"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="16" port="0x1f"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="17" port="0x20"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="18" port="0x21"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="19" port="0x22"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="20" port="0x23"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="21" port="0x24"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="22" port="0x25"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="23" port="0x26"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="24" port="0x27"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target chassis="25" port="0x28"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model name="pcie-pci-bridge"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <controller type="sata" index="0">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:47:d6:1f"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target dev="tapc8f94db4-b8"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/console.log" append="off"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target type="isa-serial" port="0">
Oct 01 14:10:24 compute-0 nova_compute[192698]:         <model name="isa-serial"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       </target>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <console type="pty">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c/console.log" append="off"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <target type="serial" port="0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </console>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="usb" bus="0" port="1"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </input>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <input type="mouse" bus="ps2"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <listen type="address" address="::"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </graphics>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <video>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <model type="virtio" heads="1" primary="yes"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </video>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:10:24 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:10:24 compute-0 nova_compute[192698]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 01 14:10:24 compute-0 nova_compute[192698]: </domain>
Oct 01 14:10:24 compute-0 nova_compute[192698]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.060 2 DEBUG nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Oct 01 14:10:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:24.431 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.548 2 DEBUG nova.virt.libvirt.migration [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.549 2 INFO nova.virt.libvirt.migration [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Increasing downtime to 50 ms after 0 sec elapsed time
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.554 2 WARNING neutronclient.v2_0.client [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.735 2 DEBUG nova.network.neutron [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Updated VIF entry in instance network info cache for port c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Oct 01 14:10:24 compute-0 nova_compute[192698]: 2025-10-01 14:10:24.735 2 DEBUG nova.network.neutron [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Updating instance_info_cache with network_info: [{"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:10:25 compute-0 nova_compute[192698]: 2025-10-01 14:10:25.242 2 DEBUG oslo_concurrency.lockutils [req-10f6a5fc-2d87-4cd6-824d-0a3fb6cc20d0 req-992b5b3c-f201-4e1d-8a7e-3f44e2060813 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-ff58d640-84a3-4709-9a4a-084f3deaac0c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:10:25 compute-0 nova_compute[192698]: 2025-10-01 14:10:25.575 2 INFO nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.093 2 DEBUG nova.virt.libvirt.migration [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.094 2 DEBUG nova.virt.libvirt.migration [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Oct 01 14:10:26 compute-0 kernel: tapc8f94db4-b8 (unregistering): left promiscuous mode
Oct 01 14:10:26 compute-0 NetworkManager[51741]: <info>  [1759327826.2055] device (tapc8f94db4-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:10:26 compute-0 ovn_controller[94909]: 2025-10-01T14:10:26Z|00094|binding|INFO|Releasing lport c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 from this chassis (sb_readonly=0)
Oct 01 14:10:26 compute-0 ovn_controller[94909]: 2025-10-01T14:10:26Z|00095|binding|INFO|Setting lport c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 down in Southbound
Oct 01 14:10:26 compute-0 ovn_controller[94909]: 2025-10-01T14:10:26Z|00096|binding|INFO|Removing iface tapc8f94db4-b8 ovn-installed in OVS
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.225 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:d6:1f 10.100.0.8'], port_security=['fa:16:3e:47:d6:1f 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'd71f76a2-379d-402b-b590-797cbe777099'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'ff58d640-84a3-4709-9a4a-084f3deaac0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-becc357a-665d-42a0-9440-5383962ecf85', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6bf51d775b7c4c15a0326680d214c2bd', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'bb26a45c-b473-4596-a233-6d94ca53d7db', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=122eb5c6-8eb5-4891-90ef-718c58d07d03, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.227 103791 INFO neutron.agent.ovn.metadata.agent [-] Port c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 in datapath becc357a-665d-42a0-9440-5383962ecf85 unbound from our chassis
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.228 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network becc357a-665d-42a0-9440-5383962ecf85, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.231 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[5f909759-cbe7-4e9b-b04e-4ad54cac2772]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.232 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-becc357a-665d-42a0-9440-5383962ecf85 namespace which is not needed anymore
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:26 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Oct 01 14:10:26 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Consumed 15.607s CPU time.
Oct 01 14:10:26 compute-0 systemd-machined[152704]: Machine qemu-7-instance-0000000b terminated.
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.371 2 DEBUG nova.compute.manager [req-0004159b-180a-4a8e-a76f-a613eaae3be2 req-15bae1a9-4212-45d2-8f7f-204c2e3a7e75 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-unplugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.373 2 DEBUG oslo_concurrency.lockutils [req-0004159b-180a-4a8e-a76f-a613eaae3be2 req-15bae1a9-4212-45d2-8f7f-204c2e3a7e75 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.373 2 DEBUG oslo_concurrency.lockutils [req-0004159b-180a-4a8e-a76f-a613eaae3be2 req-15bae1a9-4212-45d2-8f7f-204c2e3a7e75 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.373 2 DEBUG oslo_concurrency.lockutils [req-0004159b-180a-4a8e-a76f-a613eaae3be2 req-15bae1a9-4212-45d2-8f7f-204c2e3a7e75 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.374 2 DEBUG nova.compute.manager [req-0004159b-180a-4a8e-a76f-a613eaae3be2 req-15bae1a9-4212-45d2-8f7f-204c2e3a7e75 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] No waiting events found dispatching network-vif-unplugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.374 2 DEBUG nova.compute.manager [req-0004159b-180a-4a8e-a76f-a613eaae3be2 req-15bae1a9-4212-45d2-8f7f-204c2e3a7e75 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-unplugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:10:26 compute-0 neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85[218572]: [NOTICE]   (218576) : haproxy version is 3.0.5-8e879a5
Oct 01 14:10:26 compute-0 neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85[218572]: [NOTICE]   (218576) : path to executable is /usr/sbin/haproxy
Oct 01 14:10:26 compute-0 neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85[218572]: [WARNING]  (218576) : Exiting Master process...
Oct 01 14:10:26 compute-0 podman[218853]: 2025-10-01 14:10:26.408864964 +0000 UTC m=+0.053231853 container kill d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:10:26 compute-0 neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85[218572]: [ALERT]    (218576) : Current worker (218578) exited with code 143 (Terminated)
Oct 01 14:10:26 compute-0 neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85[218572]: [WARNING]  (218576) : All workers exited. Exiting... (0)
Oct 01 14:10:26 compute-0 systemd[1]: libpod-d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff.scope: Deactivated successfully.
Oct 01 14:10:26 compute-0 conmon[218572]: conmon d3e66bc6169a83a80bfc <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff.scope/container/memory.events
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.462 2 DEBUG nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.463 2 DEBUG nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.463 2 DEBUG nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Oct 01 14:10:26 compute-0 podman[218880]: 2025-10-01 14:10:26.469694111 +0000 UTC m=+0.024839749 container died d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 14:10:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff-userdata-shm.mount: Deactivated successfully.
Oct 01 14:10:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-1683e9ed7d9ef4a00f6de6000b1a8a8179acacd83f6d26b9201be1b2885f0a80-merged.mount: Deactivated successfully.
Oct 01 14:10:26 compute-0 podman[218880]: 2025-10-01 14:10:26.515517664 +0000 UTC m=+0.070663282 container cleanup d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 01 14:10:26 compute-0 systemd[1]: libpod-conmon-d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff.scope: Deactivated successfully.
Oct 01 14:10:26 compute-0 podman[218879]: 2025-10-01 14:10:26.536766286 +0000 UTC m=+0.079607453 container remove d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.555 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[0dc30207-02b4-4264-b5bc-b935d54ec9d3]: (4, ("Wed Oct  1 02:10:26 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85 (d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff)\nd3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff\nWed Oct  1 02:10:26 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-becc357a-665d-42a0-9440-5383962ecf85 (d3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff)\nd3e66bc6169a83a80bfce3f4d8574352b82fb5d9d36be8f58882ba58598c48ff\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.557 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a6858ad0-95b6-4557-b4e9-21f29c569d97]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.558 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/becc357a-665d-42a0-9440-5383962ecf85.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/becc357a-665d-42a0-9440-5383962ecf85.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.558 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[937e0532-1d5a-43fa-bf7a-e08a1a745892]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.559 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbecc357a-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.597 2 DEBUG nova.virt.libvirt.guest [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'ff58d640-84a3-4709-9a4a-084f3deaac0c' (instance-0000000b) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.598 2 INFO nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Migration operation has completed
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.599 2 INFO nova.compute.manager [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] _post_live_migration() is started..
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:26 compute-0 kernel: tapbecc357a-60: left promiscuous mode
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.610 2 WARNING neutronclient.v2_0.client [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.612 2 WARNING neutronclient.v2_0.client [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:10:26 compute-0 nova_compute[192698]: 2025-10-01 14:10:26.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.629 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[6ae609b4-5879-43fb-b7ba-4a2214d14edf]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.648 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[67f470e0-4c9f-47c4-93a0-ab1695d119b1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.651 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[3748a41e-af0c-447e-82bb-4159fcee9a07]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.676 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[40e16fd2-54e2-49d0-bf90-3670b083bf00]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413698, 'reachable_time': 43156, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218916, 'error': None, 'target': 'ovnmeta-becc357a-665d-42a0-9440-5383962ecf85', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:10:26 compute-0 systemd[1]: run-netns-ovnmeta\x2dbecc357a\x2d665d\x2d42a0\x2d9440\x2d5383962ecf85.mount: Deactivated successfully.
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.681 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-becc357a-665d-42a0-9440-5383962ecf85 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:10:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:10:26.681 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[07e68c1f-bc99-43be-b0b6-ae0bdc7bad76]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.341 2 DEBUG nova.compute.manager [req-07f84323-052b-42aa-b956-e2989e5e3343 req-23e26ca2-7ba3-48ad-accf-853dc38d39e0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-unplugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.341 2 DEBUG oslo_concurrency.lockutils [req-07f84323-052b-42aa-b956-e2989e5e3343 req-23e26ca2-7ba3-48ad-accf-853dc38d39e0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.342 2 DEBUG oslo_concurrency.lockutils [req-07f84323-052b-42aa-b956-e2989e5e3343 req-23e26ca2-7ba3-48ad-accf-853dc38d39e0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.342 2 DEBUG oslo_concurrency.lockutils [req-07f84323-052b-42aa-b956-e2989e5e3343 req-23e26ca2-7ba3-48ad-accf-853dc38d39e0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.342 2 DEBUG nova.compute.manager [req-07f84323-052b-42aa-b956-e2989e5e3343 req-23e26ca2-7ba3-48ad-accf-853dc38d39e0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] No waiting events found dispatching network-vif-unplugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.343 2 DEBUG nova.compute.manager [req-07f84323-052b-42aa-b956-e2989e5e3343 req-23e26ca2-7ba3-48ad-accf-853dc38d39e0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-unplugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.453 2 DEBUG nova.network.neutron [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Activated binding for port c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.454 2 DEBUG nova.compute.manager [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.455 2 DEBUG nova.virt.libvirt.vif [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:09:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1583961141',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1583961141',id=11,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:09:38Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6bf51d775b7c4c15a0326680d214c2bd',ramdisk_id='',reservation_id='r-xzdsyszl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-716451052',owner_user_name='tempest-TestExecuteBasicStrategy-716451052-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:10:05Z,user_data=None,user_id='fc564881007a4754ade24ed65141e269',uuid=ff58d640-84a3-4709-9a4a-084f3deaac0c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.455 2 DEBUG nova.network.os_vif_util [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "address": "fa:16:3e:47:d6:1f", "network": {"id": "becc357a-665d-42a0-9440-5383962ecf85", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-159886170-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "066bb0cdf38a41b786fd15af0a2c834e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc8f94db4-b8", "ovs_interfaceid": "c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.456 2 DEBUG nova.network.os_vif_util [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4,network=Network(becc357a-665d-42a0-9440-5383962ecf85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f94db4-b8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.456 2 DEBUG os_vif [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4,network=Network(becc357a-665d-42a0-9440-5383962ecf85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f94db4-b8') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.458 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc8f94db4-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.462 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c68abe7a-4b30-417b-aea5-3258cbe9594f) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.471 2 INFO os_vif [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:d6:1f,bridge_name='br-int',has_traffic_filtering=True,id=c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4,network=Network(becc357a-665d-42a0-9440-5383962ecf85),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc8f94db4-b8')
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.472 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.472 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.473 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.473 2 DEBUG nova.compute.manager [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.474 2 INFO nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Deleting instance files /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c_del
Oct 01 14:10:27 compute-0 nova_compute[192698]: 2025-10-01 14:10:27.475 2 INFO nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Deletion of /var/lib/nova/instances/ff58d640-84a3-4709-9a4a-084f3deaac0c_del complete
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.448 2 DEBUG nova.compute.manager [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.449 2 DEBUG oslo_concurrency.lockutils [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.449 2 DEBUG oslo_concurrency.lockutils [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.449 2 DEBUG oslo_concurrency.lockutils [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.450 2 DEBUG nova.compute.manager [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] No waiting events found dispatching network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.450 2 WARNING nova.compute.manager [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received unexpected event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 for instance with vm_state active and task_state migrating.
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.450 2 DEBUG nova.compute.manager [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-unplugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.450 2 DEBUG oslo_concurrency.lockutils [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.450 2 DEBUG oslo_concurrency.lockutils [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.451 2 DEBUG oslo_concurrency.lockutils [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.451 2 DEBUG nova.compute.manager [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] No waiting events found dispatching network-vif-unplugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.451 2 DEBUG nova.compute.manager [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-unplugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.451 2 DEBUG nova.compute.manager [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.451 2 DEBUG oslo_concurrency.lockutils [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.452 2 DEBUG oslo_concurrency.lockutils [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.452 2 DEBUG oslo_concurrency.lockutils [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.452 2 DEBUG nova.compute.manager [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] No waiting events found dispatching network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.452 2 WARNING nova.compute.manager [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received unexpected event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 for instance with vm_state active and task_state migrating.
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.452 2 DEBUG nova.compute.manager [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.453 2 DEBUG oslo_concurrency.lockutils [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.453 2 DEBUG oslo_concurrency.lockutils [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.453 2 DEBUG oslo_concurrency.lockutils [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.453 2 DEBUG nova.compute.manager [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] No waiting events found dispatching network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:10:28 compute-0 nova_compute[192698]: 2025-10-01 14:10:28.453 2 WARNING nova.compute.manager [req-48ca29be-c31b-42dc-afab-55afcafd5559 req-89ddbfb3-220a-4d54-879d-b92b390f0eff 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Received unexpected event network-vif-plugged-c8f94db4-b8dc-4c4f-a2c7-62bdaccb2cb4 for instance with vm_state active and task_state migrating.
Oct 01 14:10:29 compute-0 podman[203144]: time="2025-10-01T14:10:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:10:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:10:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:10:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:10:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3015 "" "Go-http-client/1.1"
Oct 01 14:10:30 compute-0 podman[218917]: 2025-10-01 14:10:30.194405717 +0000 UTC m=+0.100547638 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7)
Oct 01 14:10:31 compute-0 openstack_network_exporter[205307]: ERROR   14:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:10:31 compute-0 openstack_network_exporter[205307]: ERROR   14:10:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:10:31 compute-0 openstack_network_exporter[205307]: ERROR   14:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:10:31 compute-0 openstack_network_exporter[205307]: ERROR   14:10:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:10:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:10:31 compute-0 openstack_network_exporter[205307]: ERROR   14:10:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:10:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:10:32 compute-0 nova_compute[192698]: 2025-10-01 14:10:32.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:33 compute-0 nova_compute[192698]: 2025-10-01 14:10:33.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:36 compute-0 nova_compute[192698]: 2025-10-01 14:10:36.015 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:36 compute-0 nova_compute[192698]: 2025-10-01 14:10:36.015 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:36 compute-0 nova_compute[192698]: 2025-10-01 14:10:36.015 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ff58d640-84a3-4709-9a4a-084f3deaac0c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:36 compute-0 podman[218939]: 2025-10-01 14:10:36.175477375 +0000 UTC m=+0.085954794 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 01 14:10:36 compute-0 podman[218940]: 2025-10-01 14:10:36.190340675 +0000 UTC m=+0.100098914 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20250930)
Oct 01 14:10:36 compute-0 nova_compute[192698]: 2025-10-01 14:10:36.530 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:36 compute-0 nova_compute[192698]: 2025-10-01 14:10:36.530 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:36 compute-0 nova_compute[192698]: 2025-10-01 14:10:36.531 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:36 compute-0 nova_compute[192698]: 2025-10-01 14:10:36.531 2 DEBUG nova.compute.resource_tracker [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:10:36 compute-0 nova_compute[192698]: 2025-10-01 14:10:36.781 2 WARNING nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:10:36 compute-0 nova_compute[192698]: 2025-10-01 14:10:36.783 2 DEBUG oslo_concurrency.processutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:10:36 compute-0 nova_compute[192698]: 2025-10-01 14:10:36.818 2 DEBUG oslo_concurrency.processutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:10:36 compute-0 nova_compute[192698]: 2025-10-01 14:10:36.819 2 DEBUG nova.compute.resource_tracker [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5835MB free_disk=73.30538940429688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:10:36 compute-0 nova_compute[192698]: 2025-10-01 14:10:36.820 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:10:36 compute-0 nova_compute[192698]: 2025-10-01 14:10:36.821 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:10:37 compute-0 nova_compute[192698]: 2025-10-01 14:10:37.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:37 compute-0 nova_compute[192698]: 2025-10-01 14:10:37.842 2 DEBUG nova.compute.resource_tracker [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Migration for instance ff58d640-84a3-4709-9a4a-084f3deaac0c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 01 14:10:38 compute-0 nova_compute[192698]: 2025-10-01 14:10:38.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:38 compute-0 nova_compute[192698]: 2025-10-01 14:10:38.350 2 DEBUG nova.compute.resource_tracker [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 01 14:10:38 compute-0 nova_compute[192698]: 2025-10-01 14:10:38.390 2 DEBUG nova.compute.resource_tracker [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Migration 95d4b164-ecb5-485b-b5eb-37e5147e6ed1 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Oct 01 14:10:38 compute-0 nova_compute[192698]: 2025-10-01 14:10:38.391 2 DEBUG nova.compute.resource_tracker [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:10:38 compute-0 nova_compute[192698]: 2025-10-01 14:10:38.391 2 DEBUG nova.compute.resource_tracker [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:10:36 up  1:09,  0 user,  load average: 0.12, 0.30, 0.44\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:10:38 compute-0 nova_compute[192698]: 2025-10-01 14:10:38.434 2 DEBUG nova.compute.provider_tree [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:10:38 compute-0 nova_compute[192698]: 2025-10-01 14:10:38.945 2 DEBUG nova.scheduler.client.report [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:10:39 compute-0 nova_compute[192698]: 2025-10-01 14:10:39.458 2 DEBUG nova.compute.resource_tracker [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:10:39 compute-0 nova_compute[192698]: 2025-10-01 14:10:39.459 2 DEBUG oslo_concurrency.lockutils [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.638s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:10:39 compute-0 nova_compute[192698]: 2025-10-01 14:10:39.484 2 INFO nova.compute.manager [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Oct 01 14:10:40 compute-0 nova_compute[192698]: 2025-10-01 14:10:40.557 2 INFO nova.scheduler.client.report [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Deleted allocation for migration 95d4b164-ecb5-485b-b5eb-37e5147e6ed1
Oct 01 14:10:40 compute-0 nova_compute[192698]: 2025-10-01 14:10:40.557 2 DEBUG nova.virt.libvirt.driver [None req-90cf24fc-3bd8-4cf1-a063-5a20060f2f46 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ff58d640-84a3-4709-9a4a-084f3deaac0c] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Oct 01 14:10:42 compute-0 podman[218980]: 2025-10-01 14:10:42.193911601 +0000 UTC m=+0.097327780 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:10:42 compute-0 nova_compute[192698]: 2025-10-01 14:10:42.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:43 compute-0 nova_compute[192698]: 2025-10-01 14:10:43.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:47 compute-0 nova_compute[192698]: 2025-10-01 14:10:47.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:48 compute-0 nova_compute[192698]: 2025-10-01 14:10:48.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:52 compute-0 nova_compute[192698]: 2025-10-01 14:10:52.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:52 compute-0 nova_compute[192698]: 2025-10-01 14:10:52.734 2 DEBUG nova.compute.manager [None req-8de041ca-2af2-449a-b1ad-da20e59301fb 1b0ba8d8c771490ab1005529976fdb7e 9dacac6049d34f02846f752af09ae16f - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Oct 01 14:10:52 compute-0 nova_compute[192698]: 2025-10-01 14:10:52.793 2 DEBUG nova.compute.provider_tree [None req-8de041ca-2af2-449a-b1ad-da20e59301fb 1b0ba8d8c771490ab1005529976fdb7e 9dacac6049d34f02846f752af09ae16f - - default default] Updating resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 generation from 11 to 14 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 01 14:10:53 compute-0 podman[219003]: 2025-10-01 14:10:53.191393523 +0000 UTC m=+0.087007563 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct 01 14:10:53 compute-0 nova_compute[192698]: 2025-10-01 14:10:53.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:53 compute-0 podman[219004]: 2025-10-01 14:10:53.219467568 +0000 UTC m=+0.110970407 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Oct 01 14:10:57 compute-0 nova_compute[192698]: 2025-10-01 14:10:57.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:58 compute-0 nova_compute[192698]: 2025-10-01 14:10:58.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:58 compute-0 nova_compute[192698]: 2025-10-01 14:10:58.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:10:59 compute-0 podman[203144]: time="2025-10-01T14:10:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:10:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:10:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:10:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:10:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Oct 01 14:11:01 compute-0 podman[219048]: 2025-10-01 14:11:01.203069225 +0000 UTC m=+0.110283409 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=edpm)
Oct 01 14:11:01 compute-0 openstack_network_exporter[205307]: ERROR   14:11:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:11:01 compute-0 openstack_network_exporter[205307]: ERROR   14:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:11:01 compute-0 openstack_network_exporter[205307]: ERROR   14:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:11:01 compute-0 openstack_network_exporter[205307]: ERROR   14:11:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:11:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:11:01 compute-0 openstack_network_exporter[205307]: ERROR   14:11:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:11:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:11:02 compute-0 nova_compute[192698]: 2025-10-01 14:11:02.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:03 compute-0 nova_compute[192698]: 2025-10-01 14:11:03.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:07 compute-0 podman[219071]: 2025-10-01 14:11:07.176090949 +0000 UTC m=+0.084967308 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 01 14:11:07 compute-0 podman[219072]: 2025-10-01 14:11:07.195206443 +0000 UTC m=+0.093120977 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 01 14:11:07 compute-0 nova_compute[192698]: 2025-10-01 14:11:07.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:08 compute-0 nova_compute[192698]: 2025-10-01 14:11:08.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:08 compute-0 nova_compute[192698]: 2025-10-01 14:11:08.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:11:09 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:09.006 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:06:4a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-37b393ef-9800-44f4-9d0c-5619ab6bca84', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37b393ef-9800-44f4-9d0c-5619ab6bca84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d522a996598416ea6ea51c72d7cfbf1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8541fa8e-c7c4-4255-a26d-fb804b6e0d2c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e4f6bc57-dde0-452b-a798-220aa628ff67) old=Port_Binding(mac=['fa:16:3e:f9:06:4a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-37b393ef-9800-44f4-9d0c-5619ab6bca84', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37b393ef-9800-44f4-9d0c-5619ab6bca84', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4d522a996598416ea6ea51c72d7cfbf1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:11:09 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:09.007 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e4f6bc57-dde0-452b-a798-220aa628ff67 in datapath 37b393ef-9800-44f4-9d0c-5619ab6bca84 updated
Oct 01 14:11:09 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:09.008 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 37b393ef-9800-44f4-9d0c-5619ab6bca84, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:11:09 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:09.009 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d630aa6c-511a-4ac7-a5b3-0ef835f94739]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:11:11 compute-0 nova_compute[192698]: 2025-10-01 14:11:11.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:11:12 compute-0 nova_compute[192698]: 2025-10-01 14:11:12.438 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:11:12 compute-0 nova_compute[192698]: 2025-10-01 14:11:12.439 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:11:12 compute-0 nova_compute[192698]: 2025-10-01 14:11:12.439 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:11:12 compute-0 nova_compute[192698]: 2025-10-01 14:11:12.439 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:11:12 compute-0 nova_compute[192698]: 2025-10-01 14:11:12.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:12 compute-0 nova_compute[192698]: 2025-10-01 14:11:12.647 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:11:12 compute-0 nova_compute[192698]: 2025-10-01 14:11:12.649 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:11:12 compute-0 nova_compute[192698]: 2025-10-01 14:11:12.682 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:11:12 compute-0 nova_compute[192698]: 2025-10-01 14:11:12.683 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5829MB free_disk=73.3054084777832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:11:12 compute-0 nova_compute[192698]: 2025-10-01 14:11:12.683 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:11:12 compute-0 nova_compute[192698]: 2025-10-01 14:11:12.684 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:11:13 compute-0 podman[219110]: 2025-10-01 14:11:13.202875809 +0000 UTC m=+0.115445097 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 14:11:13 compute-0 nova_compute[192698]: 2025-10-01 14:11:13.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:13 compute-0 nova_compute[192698]: 2025-10-01 14:11:13.732 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:11:13 compute-0 nova_compute[192698]: 2025-10-01 14:11:13.733 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:11:12 up  1:10,  0 user,  load average: 0.06, 0.27, 0.42\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:11:13 compute-0 nova_compute[192698]: 2025-10-01 14:11:13.769 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:11:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:14.249 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:11:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:14.249 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:11:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:14.250 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:11:14 compute-0 nova_compute[192698]: 2025-10-01 14:11:14.277 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:11:14 compute-0 nova_compute[192698]: 2025-10-01 14:11:14.789 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:11:14 compute-0 nova_compute[192698]: 2025-10-01 14:11:14.790 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:11:15 compute-0 nova_compute[192698]: 2025-10-01 14:11:15.790 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:11:15 compute-0 nova_compute[192698]: 2025-10-01 14:11:15.792 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:11:15 compute-0 nova_compute[192698]: 2025-10-01 14:11:15.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:11:15 compute-0 nova_compute[192698]: 2025-10-01 14:11:15.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:11:16 compute-0 nova_compute[192698]: 2025-10-01 14:11:16.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:11:17 compute-0 nova_compute[192698]: 2025-10-01 14:11:17.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:18 compute-0 nova_compute[192698]: 2025-10-01 14:11:18.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:18 compute-0 nova_compute[192698]: 2025-10-01 14:11:18.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:11:20 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:20.180 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:6b:49 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-41c06b88-819f-4308-8966-c22841a21d40', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41c06b88-819f-4308-8966-c22841a21d40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'be9ba3892ca64853879f2eaa6c510492', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ed54ae0-4bb4-4770-80ea-7d9ba18f0b8e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4035aa73-60ed-49e2-98ca-89bb9ec90782) old=Port_Binding(mac=['fa:16:3e:a7:6b:49'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-41c06b88-819f-4308-8966-c22841a21d40', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41c06b88-819f-4308-8966-c22841a21d40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'be9ba3892ca64853879f2eaa6c510492', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:11:20 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:20.182 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4035aa73-60ed-49e2-98ca-89bb9ec90782 in datapath 41c06b88-819f-4308-8966-c22841a21d40 updated
Oct 01 14:11:20 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:20.184 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41c06b88-819f-4308-8966-c22841a21d40, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:11:20 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:20.185 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[2df912dd-a5b8-43fa-838d-8d62eca242da]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:11:20 compute-0 nova_compute[192698]: 2025-10-01 14:11:20.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:11:20 compute-0 nova_compute[192698]: 2025-10-01 14:11:20.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:11:22 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:22.062 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:11:22 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:22.063 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:11:22 compute-0 nova_compute[192698]: 2025-10-01 14:11:22.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:22 compute-0 nova_compute[192698]: 2025-10-01 14:11:22.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:23 compute-0 nova_compute[192698]: 2025-10-01 14:11:23.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:24 compute-0 podman[219138]: 2025-10-01 14:11:24.160922651 +0000 UTC m=+0.070391165 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:11:24 compute-0 podman[219139]: 2025-10-01 14:11:24.207666099 +0000 UTC m=+0.118203892 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 01 14:11:26 compute-0 sshd-session[219180]: Connection closed by 14.103.205.40 port 41746 [preauth]
Oct 01 14:11:27 compute-0 nova_compute[192698]: 2025-10-01 14:11:27.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:28 compute-0 nova_compute[192698]: 2025-10-01 14:11:28.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:29 compute-0 podman[203144]: time="2025-10-01T14:11:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:11:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:11:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:11:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:11:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3016 "" "Go-http-client/1.1"
Oct 01 14:11:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:11:31.066 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:11:31 compute-0 openstack_network_exporter[205307]: ERROR   14:11:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:11:31 compute-0 openstack_network_exporter[205307]: ERROR   14:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:11:31 compute-0 openstack_network_exporter[205307]: ERROR   14:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:11:31 compute-0 openstack_network_exporter[205307]: ERROR   14:11:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:11:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:11:31 compute-0 openstack_network_exporter[205307]: ERROR   14:11:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:11:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:11:32 compute-0 podman[219182]: 2025-10-01 14:11:32.185444716 +0000 UTC m=+0.083821907 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64)
Oct 01 14:11:32 compute-0 nova_compute[192698]: 2025-10-01 14:11:32.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:33 compute-0 ovn_controller[94909]: 2025-10-01T14:11:33Z|00097|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct 01 14:11:33 compute-0 nova_compute[192698]: 2025-10-01 14:11:33.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:37 compute-0 nova_compute[192698]: 2025-10-01 14:11:37.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:38 compute-0 podman[219204]: 2025-10-01 14:11:38.195479296 +0000 UTC m=+0.095873280 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930)
Oct 01 14:11:38 compute-0 podman[219205]: 2025-10-01 14:11:38.199673499 +0000 UTC m=+0.093859596 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 01 14:11:38 compute-0 nova_compute[192698]: 2025-10-01 14:11:38.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:42 compute-0 nova_compute[192698]: 2025-10-01 14:11:42.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:43 compute-0 nova_compute[192698]: 2025-10-01 14:11:43.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:44 compute-0 podman[219246]: 2025-10-01 14:11:44.168939668 +0000 UTC m=+0.079891223 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:11:47 compute-0 nova_compute[192698]: 2025-10-01 14:11:47.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:48 compute-0 nova_compute[192698]: 2025-10-01 14:11:48.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:52 compute-0 nova_compute[192698]: 2025-10-01 14:11:52.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:53 compute-0 nova_compute[192698]: 2025-10-01 14:11:53.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:55 compute-0 podman[219271]: 2025-10-01 14:11:55.170440385 +0000 UTC m=+0.079959914 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 14:11:55 compute-0 podman[219272]: 2025-10-01 14:11:55.202946514 +0000 UTC m=+0.108145166 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 01 14:11:57 compute-0 nova_compute[192698]: 2025-10-01 14:11:57.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:58 compute-0 nova_compute[192698]: 2025-10-01 14:11:58.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:11:59 compute-0 podman[203144]: time="2025-10-01T14:11:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:11:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:11:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:11:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:11:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3022 "" "Go-http-client/1.1"
Oct 01 14:12:01 compute-0 openstack_network_exporter[205307]: ERROR   14:12:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:12:01 compute-0 openstack_network_exporter[205307]: ERROR   14:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:12:01 compute-0 openstack_network_exporter[205307]: ERROR   14:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:12:01 compute-0 openstack_network_exporter[205307]: ERROR   14:12:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:12:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:12:01 compute-0 openstack_network_exporter[205307]: ERROR   14:12:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:12:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:12:02 compute-0 nova_compute[192698]: 2025-10-01 14:12:02.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:03 compute-0 podman[219315]: 2025-10-01 14:12:03.169401589 +0000 UTC m=+0.079229185 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_id=edpm)
Oct 01 14:12:03 compute-0 nova_compute[192698]: 2025-10-01 14:12:03.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:03.407 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:ed:77 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8562f9c0-0a2b-4e53-975b-dd543293c802', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8120df3906db49b8ac8fa624e2f2aad4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18a05557-2e37-4ffc-9c62-b55a7756059d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b5ee4d88-5d32-4dfa-ae97-c0c0976243b5) old=Port_Binding(mac=['fa:16:3e:f4:ed:77'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8562f9c0-0a2b-4e53-975b-dd543293c802', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8120df3906db49b8ac8fa624e2f2aad4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:12:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:03.408 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b5ee4d88-5d32-4dfa-ae97-c0c0976243b5 in datapath 8562f9c0-0a2b-4e53-975b-dd543293c802 updated
Oct 01 14:12:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:03.410 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8562f9c0-0a2b-4e53-975b-dd543293c802, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:12:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:03.411 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[0336a555-d0d5-4418-a1f2-3a1ad77ea9ed]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:12:07 compute-0 nova_compute[192698]: 2025-10-01 14:12:07.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:08 compute-0 nova_compute[192698]: 2025-10-01 14:12:08.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:09 compute-0 podman[219337]: 2025-10-01 14:12:09.191940797 +0000 UTC m=+0.096887502 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 01 14:12:09 compute-0 podman[219336]: 2025-10-01 14:12:09.193840128 +0000 UTC m=+0.101938199 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 01 14:12:09 compute-0 nova_compute[192698]: 2025-10-01 14:12:09.927 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:12:11 compute-0 nova_compute[192698]: 2025-10-01 14:12:11.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:12:12 compute-0 nova_compute[192698]: 2025-10-01 14:12:12.446 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:12:12 compute-0 nova_compute[192698]: 2025-10-01 14:12:12.447 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:12:12 compute-0 nova_compute[192698]: 2025-10-01 14:12:12.448 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:12:12 compute-0 nova_compute[192698]: 2025-10-01 14:12:12.448 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:12:12 compute-0 nova_compute[192698]: 2025-10-01 14:12:12.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:12 compute-0 nova_compute[192698]: 2025-10-01 14:12:12.693 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:12:12 compute-0 nova_compute[192698]: 2025-10-01 14:12:12.695 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:12:12 compute-0 nova_compute[192698]: 2025-10-01 14:12:12.730 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:12:12 compute-0 nova_compute[192698]: 2025-10-01 14:12:12.732 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5826MB free_disk=73.3054084777832GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:12:12 compute-0 nova_compute[192698]: 2025-10-01 14:12:12.732 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:12:12 compute-0 nova_compute[192698]: 2025-10-01 14:12:12.733 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:12:13 compute-0 nova_compute[192698]: 2025-10-01 14:12:13.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:13.259 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:56:b5 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-3abff5a0-ca68-452f-bb65-9e095293554b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3abff5a0-ca68-452f-bb65-9e095293554b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f5565c36a294928af6bcd073bff4643', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=22a43b2c-f159-450e-9328-aa56ee42934e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1059b09b-48ec-48eb-ab41-55e7519289b4) old=Port_Binding(mac=['fa:16:3e:7c:56:b5'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-3abff5a0-ca68-452f-bb65-9e095293554b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3abff5a0-ca68-452f-bb65-9e095293554b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f5565c36a294928af6bcd073bff4643', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:12:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:13.260 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1059b09b-48ec-48eb-ab41-55e7519289b4 in datapath 3abff5a0-ca68-452f-bb65-9e095293554b updated
Oct 01 14:12:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:13.261 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3abff5a0-ca68-452f-bb65-9e095293554b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:12:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:13.262 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f116105d-2264-427a-b51f-f9dfd07078da]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:12:13 compute-0 nova_compute[192698]: 2025-10-01 14:12:13.790 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:12:13 compute-0 nova_compute[192698]: 2025-10-01 14:12:13.791 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:12:12 up  1:11,  0 user,  load average: 0.02, 0.22, 0.40\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:12:13 compute-0 nova_compute[192698]: 2025-10-01 14:12:13.818 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:12:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:14.251 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:12:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:14.251 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:12:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:14.252 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:12:14 compute-0 nova_compute[192698]: 2025-10-01 14:12:14.326 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:12:14 compute-0 nova_compute[192698]: 2025-10-01 14:12:14.837 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:12:14 compute-0 nova_compute[192698]: 2025-10-01 14:12:14.837 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.104s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:12:15 compute-0 podman[219379]: 2025-10-01 14:12:15.175855958 +0000 UTC m=+0.084440986 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:12:15 compute-0 nova_compute[192698]: 2025-10-01 14:12:15.837 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:12:15 compute-0 nova_compute[192698]: 2025-10-01 14:12:15.838 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:12:15 compute-0 nova_compute[192698]: 2025-10-01 14:12:15.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:12:17 compute-0 nova_compute[192698]: 2025-10-01 14:12:17.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:17 compute-0 nova_compute[192698]: 2025-10-01 14:12:17.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:12:18 compute-0 nova_compute[192698]: 2025-10-01 14:12:18.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:18 compute-0 nova_compute[192698]: 2025-10-01 14:12:18.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:12:22 compute-0 nova_compute[192698]: 2025-10-01 14:12:22.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:22 compute-0 nova_compute[192698]: 2025-10-01 14:12:22.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:12:22 compute-0 nova_compute[192698]: 2025-10-01 14:12:22.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:12:23 compute-0 nova_compute[192698]: 2025-10-01 14:12:23.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:26 compute-0 podman[219403]: 2025-10-01 14:12:26.138446671 +0000 UTC m=+0.058301069 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 01 14:12:26 compute-0 podman[219404]: 2025-10-01 14:12:26.240012779 +0000 UTC m=+0.149874976 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 01 14:12:27 compute-0 nova_compute[192698]: 2025-10-01 14:12:27.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:28 compute-0 nova_compute[192698]: 2025-10-01 14:12:28.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:28.459 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:12:28 compute-0 nova_compute[192698]: 2025-10-01 14:12:28.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:28 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:28.460 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:12:29 compute-0 podman[203144]: time="2025-10-01T14:12:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:12:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:12:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:12:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:12:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Oct 01 14:12:31 compute-0 openstack_network_exporter[205307]: ERROR   14:12:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:12:31 compute-0 openstack_network_exporter[205307]: ERROR   14:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:12:31 compute-0 openstack_network_exporter[205307]: ERROR   14:12:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:12:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:12:31 compute-0 openstack_network_exporter[205307]: ERROR   14:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:12:31 compute-0 openstack_network_exporter[205307]: ERROR   14:12:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:12:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:12:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:12:31.462 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:12:32 compute-0 nova_compute[192698]: 2025-10-01 14:12:32.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:33 compute-0 nova_compute[192698]: 2025-10-01 14:12:33.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:34 compute-0 podman[219450]: 2025-10-01 14:12:34.186162747 +0000 UTC m=+0.094146869 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, version=9.6)
Oct 01 14:12:37 compute-0 nova_compute[192698]: 2025-10-01 14:12:37.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:38 compute-0 nova_compute[192698]: 2025-10-01 14:12:38.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:40 compute-0 podman[219473]: 2025-10-01 14:12:40.194018865 +0000 UTC m=+0.094960970 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:12:40 compute-0 podman[219472]: 2025-10-01 14:12:40.206485822 +0000 UTC m=+0.114724024 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:12:42 compute-0 nova_compute[192698]: 2025-10-01 14:12:42.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:43 compute-0 nova_compute[192698]: 2025-10-01 14:12:43.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:45 compute-0 nova_compute[192698]: 2025-10-01 14:12:45.283 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "9678ec54-31c4-4d96-a3f0-96686482f8b8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:12:45 compute-0 nova_compute[192698]: 2025-10-01 14:12:45.284 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:12:45 compute-0 podman[219511]: 2025-10-01 14:12:45.405396074 +0000 UTC m=+0.087923980 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:12:45 compute-0 nova_compute[192698]: 2025-10-01 14:12:45.789 2 DEBUG nova.compute.manager [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 01 14:12:46 compute-0 nova_compute[192698]: 2025-10-01 14:12:46.351 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:12:46 compute-0 nova_compute[192698]: 2025-10-01 14:12:46.352 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:12:46 compute-0 nova_compute[192698]: 2025-10-01 14:12:46.362 2 DEBUG nova.virt.hardware [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:12:46 compute-0 nova_compute[192698]: 2025-10-01 14:12:46.362 2 INFO nova.compute.claims [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:12:47 compute-0 nova_compute[192698]: 2025-10-01 14:12:47.426 2 DEBUG nova.compute.provider_tree [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:12:47 compute-0 nova_compute[192698]: 2025-10-01 14:12:47.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:47 compute-0 nova_compute[192698]: 2025-10-01 14:12:47.934 2 DEBUG nova.scheduler.client.report [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:12:48 compute-0 nova_compute[192698]: 2025-10-01 14:12:48.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:48 compute-0 nova_compute[192698]: 2025-10-01 14:12:48.445 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:12:48 compute-0 nova_compute[192698]: 2025-10-01 14:12:48.446 2 DEBUG nova.compute.manager [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 01 14:12:48 compute-0 nova_compute[192698]: 2025-10-01 14:12:48.959 2 DEBUG nova.compute.manager [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 01 14:12:48 compute-0 nova_compute[192698]: 2025-10-01 14:12:48.960 2 DEBUG nova.network.neutron [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 01 14:12:48 compute-0 nova_compute[192698]: 2025-10-01 14:12:48.960 2 WARNING neutronclient.v2_0.client [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:12:48 compute-0 nova_compute[192698]: 2025-10-01 14:12:48.961 2 WARNING neutronclient.v2_0.client [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:12:49 compute-0 nova_compute[192698]: 2025-10-01 14:12:49.470 2 INFO nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 01 14:12:49 compute-0 nova_compute[192698]: 2025-10-01 14:12:49.979 2 DEBUG nova.compute.manager [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 01 14:12:50 compute-0 nova_compute[192698]: 2025-10-01 14:12:50.344 2 DEBUG nova.network.neutron [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Successfully created port: 1f41468b-a36d-4ea0-bf4c-26b26778c31d _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.004 2 DEBUG nova.compute.manager [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.006 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.007 2 INFO nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Creating image(s)
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.008 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "/var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.008 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "/var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.009 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "/var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.010 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.016 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.019 2 DEBUG oslo_concurrency.processutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.113 2 DEBUG oslo_concurrency.processutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.116 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.118 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.119 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.128 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.129 2 DEBUG oslo_concurrency.processutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.217 2 DEBUG oslo_concurrency.processutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.219 2 DEBUG oslo_concurrency.processutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.262 2 DEBUG oslo_concurrency.processutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.264 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.265 2 DEBUG oslo_concurrency.processutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.359 2 DEBUG oslo_concurrency.processutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.361 2 DEBUG nova.virt.disk.api [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Checking if we can resize image /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.362 2 DEBUG oslo_concurrency.processutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.436 2 DEBUG oslo_concurrency.processutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.438 2 DEBUG nova.virt.disk.api [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Cannot resize image /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.438 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.440 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Ensure instance console log exists: /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.440 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.442 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:12:51 compute-0 nova_compute[192698]: 2025-10-01 14:12:51.442 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:12:52 compute-0 nova_compute[192698]: 2025-10-01 14:12:52.456 2 DEBUG nova.network.neutron [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Successfully updated port: 1f41468b-a36d-4ea0-bf4c-26b26778c31d _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 01 14:12:52 compute-0 nova_compute[192698]: 2025-10-01 14:12:52.510 2 DEBUG nova.compute.manager [req-9902134a-49de-4c03-8e77-dff9029dc73b req-58ac3013-c06e-483a-9061-dc410c223639 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Received event network-changed-1f41468b-a36d-4ea0-bf4c-26b26778c31d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:12:52 compute-0 nova_compute[192698]: 2025-10-01 14:12:52.511 2 DEBUG nova.compute.manager [req-9902134a-49de-4c03-8e77-dff9029dc73b req-58ac3013-c06e-483a-9061-dc410c223639 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Refreshing instance network info cache due to event network-changed-1f41468b-a36d-4ea0-bf4c-26b26778c31d. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:12:52 compute-0 nova_compute[192698]: 2025-10-01 14:12:52.511 2 DEBUG oslo_concurrency.lockutils [req-9902134a-49de-4c03-8e77-dff9029dc73b req-58ac3013-c06e-483a-9061-dc410c223639 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-9678ec54-31c4-4d96-a3f0-96686482f8b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:12:52 compute-0 nova_compute[192698]: 2025-10-01 14:12:52.511 2 DEBUG oslo_concurrency.lockutils [req-9902134a-49de-4c03-8e77-dff9029dc73b req-58ac3013-c06e-483a-9061-dc410c223639 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-9678ec54-31c4-4d96-a3f0-96686482f8b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:12:52 compute-0 nova_compute[192698]: 2025-10-01 14:12:52.512 2 DEBUG nova.network.neutron [req-9902134a-49de-4c03-8e77-dff9029dc73b req-58ac3013-c06e-483a-9061-dc410c223639 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Refreshing network info cache for port 1f41468b-a36d-4ea0-bf4c-26b26778c31d _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:12:52 compute-0 nova_compute[192698]: 2025-10-01 14:12:52.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:52 compute-0 nova_compute[192698]: 2025-10-01 14:12:52.966 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "refresh_cache-9678ec54-31c4-4d96-a3f0-96686482f8b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:12:53 compute-0 nova_compute[192698]: 2025-10-01 14:12:53.019 2 WARNING neutronclient.v2_0.client [req-9902134a-49de-4c03-8e77-dff9029dc73b req-58ac3013-c06e-483a-9061-dc410c223639 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:12:53 compute-0 nova_compute[192698]: 2025-10-01 14:12:53.139 2 DEBUG nova.network.neutron [req-9902134a-49de-4c03-8e77-dff9029dc73b req-58ac3013-c06e-483a-9061-dc410c223639 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:12:53 compute-0 nova_compute[192698]: 2025-10-01 14:12:53.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:53 compute-0 nova_compute[192698]: 2025-10-01 14:12:53.312 2 DEBUG nova.network.neutron [req-9902134a-49de-4c03-8e77-dff9029dc73b req-58ac3013-c06e-483a-9061-dc410c223639 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:12:53 compute-0 nova_compute[192698]: 2025-10-01 14:12:53.820 2 DEBUG oslo_concurrency.lockutils [req-9902134a-49de-4c03-8e77-dff9029dc73b req-58ac3013-c06e-483a-9061-dc410c223639 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-9678ec54-31c4-4d96-a3f0-96686482f8b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:12:53 compute-0 nova_compute[192698]: 2025-10-01 14:12:53.821 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquired lock "refresh_cache-9678ec54-31c4-4d96-a3f0-96686482f8b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:12:53 compute-0 nova_compute[192698]: 2025-10-01 14:12:53.821 2 DEBUG nova.network.neutron [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:12:55 compute-0 nova_compute[192698]: 2025-10-01 14:12:55.143 2 DEBUG nova.network.neutron [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:12:55 compute-0 nova_compute[192698]: 2025-10-01 14:12:55.394 2 WARNING neutronclient.v2_0.client [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.053 2 DEBUG nova.network.neutron [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Updating instance_info_cache with network_info: [{"id": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "address": "fa:16:3e:df:81:60", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f41468b-a3", "ovs_interfaceid": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.568 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Releasing lock "refresh_cache-9678ec54-31c4-4d96-a3f0-96686482f8b8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.569 2 DEBUG nova.compute.manager [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Instance network_info: |[{"id": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "address": "fa:16:3e:df:81:60", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f41468b-a3", "ovs_interfaceid": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.573 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Start _get_guest_xml network_info=[{"id": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "address": "fa:16:3e:df:81:60", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f41468b-a3", "ovs_interfaceid": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.580 2 WARNING nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.582 2 DEBUG nova.virt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1450345243', uuid='9678ec54-31c4-4d96-a3f0-96686482f8b8'), owner=OwnerMeta(userid='8e4b771b5757444093151a3e38c0b2d7', username='tempest-TestExecuteHostMaintenanceStrategy-132658549-project-admin', projectid='9f5565c36a294928af6bcd073bff4643', projectname='tempest-TestExecuteHostMaintenanceStrategy-132658549'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "address": "fa:16:3e:df:81:60", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f41468b-a3", "ovs_interfaceid": 
"1f41468b-a36d-4ea0-bf4c-26b26778c31d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759327976.582181) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.587 2 DEBUG nova.virt.libvirt.host [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.588 2 DEBUG nova.virt.libvirt.host [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.594 2 DEBUG nova.virt.libvirt.host [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.595 2 DEBUG nova.virt.libvirt.host [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.595 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.596 2 DEBUG nova.virt.hardware [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.597 2 DEBUG nova.virt.hardware [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.597 2 DEBUG nova.virt.hardware [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.598 2 DEBUG nova.virt.hardware [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.598 2 DEBUG nova.virt.hardware [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.598 2 DEBUG nova.virt.hardware [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.599 2 DEBUG nova.virt.hardware [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.600 2 DEBUG nova.virt.hardware [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.600 2 DEBUG nova.virt.hardware [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.600 2 DEBUG nova.virt.hardware [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.601 2 DEBUG nova.virt.hardware [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.608 2 DEBUG nova.virt.libvirt.vif [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:12:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1450345243',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1450345243',id=13,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f5565c36a294928af6bcd073bff4643',ramdisk_id='',reservation_id='r-ifkg8pgx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-132658549',owner_user_name='tempest-TestExe
cuteHostMaintenanceStrategy-132658549-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:12:50Z,user_data=None,user_id='8e4b771b5757444093151a3e38c0b2d7',uuid=9678ec54-31c4-4d96-a3f0-96686482f8b8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "address": "fa:16:3e:df:81:60", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f41468b-a3", "ovs_interfaceid": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.608 2 DEBUG nova.network.os_vif_util [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converting VIF {"id": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "address": "fa:16:3e:df:81:60", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f41468b-a3", "ovs_interfaceid": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.610 2 DEBUG nova.network.os_vif_util [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:81:60,bridge_name='br-int',has_traffic_filtering=True,id=1f41468b-a36d-4ea0-bf4c-26b26778c31d,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f41468b-a3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:12:56 compute-0 nova_compute[192698]: 2025-10-01 14:12:56.612 2 DEBUG nova.objects.instance [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9678ec54-31c4-4d96-a3f0-96686482f8b8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.125 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:12:57 compute-0 nova_compute[192698]:   <uuid>9678ec54-31c4-4d96-a3f0-96686482f8b8</uuid>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   <name>instance-0000000d</name>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1450345243</nova:name>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:12:56</nova:creationTime>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:12:57 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:12:57 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:user uuid="8e4b771b5757444093151a3e38c0b2d7">tempest-TestExecuteHostMaintenanceStrategy-132658549-project-admin</nova:user>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:project uuid="9f5565c36a294928af6bcd073bff4643">tempest-TestExecuteHostMaintenanceStrategy-132658549</nova:project>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         <nova:port uuid="1f41468b-a36d-4ea0-bf4c-26b26778c31d">
Oct 01 14:12:57 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <system>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <entry name="serial">9678ec54-31c4-4d96-a3f0-96686482f8b8</entry>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <entry name="uuid">9678ec54-31c4-4d96-a3f0-96686482f8b8</entry>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     </system>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   <os>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   </os>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   <features>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   </features>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk.config"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:df:81:60"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <target dev="tap1f41468b-a3"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/console.log" append="off"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <video>
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     </video>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:12:57 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:12:57 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:12:57 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:12:57 compute-0 nova_compute[192698]: </domain>
Oct 01 14:12:57 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.125 2 DEBUG nova.compute.manager [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Preparing to wait for external event network-vif-plugged-1f41468b-a36d-4ea0-bf4c-26b26778c31d prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.126 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.126 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.126 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.127 2 DEBUG nova.virt.libvirt.vif [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:12:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1450345243',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1450345243',id=13,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f5565c36a294928af6bcd073bff4643',ramdisk_id='',reservation_id='r-ifkg8pgx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-132658549',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-132658549-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:12:50Z,user_data=None,user_id='8e4b771b5757444093151a3e38c0b2d7',uuid=9678ec54-31c4-4d96-a3f0-96686482f8b8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "address": "fa:16:3e:df:81:60", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f41468b-a3", "ovs_interfaceid": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.127 2 DEBUG nova.network.os_vif_util [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converting VIF {"id": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "address": "fa:16:3e:df:81:60", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f41468b-a3", "ovs_interfaceid": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.128 2 DEBUG nova.network.os_vif_util [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:81:60,bridge_name='br-int',has_traffic_filtering=True,id=1f41468b-a36d-4ea0-bf4c-26b26778c31d,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f41468b-a3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.128 2 DEBUG os_vif [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:81:60,bridge_name='br-int',has_traffic_filtering=True,id=1f41468b-a36d-4ea0-bf4c-26b26778c31d,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f41468b-a3') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.129 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.129 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.130 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a6cafcb7-429a-5a82-8b8d-d6ac4e5c5975', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.139 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f41468b-a3, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.139 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap1f41468b-a3, col_values=(('qos', UUID('3c174cb2-350c-417f-a5cf-a196aafff863')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.140 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap1f41468b-a3, col_values=(('external_ids', {'iface-id': '1f41468b-a36d-4ea0-bf4c-26b26778c31d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:81:60', 'vm-uuid': '9678ec54-31c4-4d96-a3f0-96686482f8b8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:57 compute-0 NetworkManager[51741]: <info>  [1759327977.1434] manager: (tap1f41468b-a3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:57 compute-0 nova_compute[192698]: 2025-10-01 14:12:57.156 2 INFO os_vif [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:81:60,bridge_name='br-int',has_traffic_filtering=True,id=1f41468b-a36d-4ea0-bf4c-26b26778c31d,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f41468b-a3')
Oct 01 14:12:57 compute-0 podman[219551]: 2025-10-01 14:12:57.218589634 +0000 UTC m=+0.122058633 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 01 14:12:57 compute-0 podman[219552]: 2025-10-01 14:12:57.231171145 +0000 UTC m=+0.132744803 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 01 14:12:58 compute-0 nova_compute[192698]: 2025-10-01 14:12:58.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:12:58 compute-0 nova_compute[192698]: 2025-10-01 14:12:58.700 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:12:58 compute-0 nova_compute[192698]: 2025-10-01 14:12:58.701 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:12:58 compute-0 nova_compute[192698]: 2025-10-01 14:12:58.701 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] No VIF found with MAC fa:16:3e:df:81:60, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:12:58 compute-0 nova_compute[192698]: 2025-10-01 14:12:58.702 2 INFO nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Using config drive
Oct 01 14:12:59 compute-0 nova_compute[192698]: 2025-10-01 14:12:59.218 2 WARNING neutronclient.v2_0.client [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:12:59 compute-0 podman[203144]: time="2025-10-01T14:12:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:12:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:12:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:12:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:12:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3021 "" "Go-http-client/1.1"
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.238 2 INFO nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Creating config drive at /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk.config
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.247 2 DEBUG oslo_concurrency.processutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpaqmuf62q execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.373 2 DEBUG oslo_concurrency.processutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpaqmuf62q" returned: 0 in 0.125s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:13:00 compute-0 kernel: tap1f41468b-a3: entered promiscuous mode
Oct 01 14:13:00 compute-0 NetworkManager[51741]: <info>  [1759327980.4666] manager: (tap1f41468b-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Oct 01 14:13:00 compute-0 ovn_controller[94909]: 2025-10-01T14:13:00Z|00098|binding|INFO|Claiming lport 1f41468b-a36d-4ea0-bf4c-26b26778c31d for this chassis.
Oct 01 14:13:00 compute-0 ovn_controller[94909]: 2025-10-01T14:13:00Z|00099|binding|INFO|1f41468b-a36d-4ea0-bf4c-26b26778c31d: Claiming fa:16:3e:df:81:60 10.100.0.5
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.495 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:81:60 10.100.0.5'], port_security=['fa:16:3e:df:81:60 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9678ec54-31c4-4d96-a3f0-96686482f8b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8562f9c0-0a2b-4e53-975b-dd543293c802', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f5565c36a294928af6bcd073bff4643', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd07d6cb5-684b-4a4b-83f2-c6fbca49c797', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18a05557-2e37-4ffc-9c62-b55a7756059d, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=1f41468b-a36d-4ea0-bf4c-26b26778c31d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.497 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 1f41468b-a36d-4ea0-bf4c-26b26778c31d in datapath 8562f9c0-0a2b-4e53-975b-dd543293c802 bound to our chassis
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.498 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8562f9c0-0a2b-4e53-975b-dd543293c802
Oct 01 14:13:00 compute-0 systemd-udevd[219614]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.520 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[892da22d-b585-49dc-92d7-d0c59a0cfdc4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.521 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8562f9c0-01 in ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.523 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8562f9c0-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.524 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c20ccc7c-21ff-44ff-9e07-987a0a2560e3]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.525 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[936e8963-a899-489d-99d4-a39518df6174]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 systemd-machined[152704]: New machine qemu-8-instance-0000000d.
Oct 01 14:13:00 compute-0 NetworkManager[51741]: <info>  [1759327980.5443] device (tap1f41468b-a3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:13:00 compute-0 NetworkManager[51741]: <info>  [1759327980.5456] device (tap1f41468b-a3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.548 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[94572a7b-8558-4e5d-afe7-8791a977debf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:00 compute-0 ovn_controller[94909]: 2025-10-01T14:13:00Z|00100|binding|INFO|Setting lport 1f41468b-a36d-4ea0-bf4c-26b26778c31d ovn-installed in OVS
Oct 01 14:13:00 compute-0 ovn_controller[94909]: 2025-10-01T14:13:00Z|00101|binding|INFO|Setting lport 1f41468b-a36d-4ea0-bf4c-26b26778c31d up in Southbound
Oct 01 14:13:00 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000d.
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.572 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4efe069c-cf7b-403d-9f18-d52e583d726c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.606 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[ca6bddca-09fa-4d7c-90ea-8bd9adcefd19]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.613 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[55c920b3-4fd4-4881-bccf-282630a631bb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 systemd-udevd[219618]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:13:00 compute-0 NetworkManager[51741]: <info>  [1759327980.6154] manager: (tap8562f9c0-00): new Veth device (/org/freedesktop/NetworkManager/Devices/44)
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.672 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[8223a00e-0232-4c53-afb8-29c538d193ce]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.676 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[e9c1322d-ae52-4ac2-aabb-e5e65b1663f6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 NetworkManager[51741]: <info>  [1759327980.7092] device (tap8562f9c0-00): carrier: link connected
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.718 2 DEBUG nova.compute.manager [req-50e01076-918d-40e4-ac9a-c4c00ef75f8c req-7c832c32-829e-4103-bbc5-a6ee9ba7949e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Received event network-vif-plugged-1f41468b-a36d-4ea0-bf4c-26b26778c31d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.718 2 DEBUG oslo_concurrency.lockutils [req-50e01076-918d-40e4-ac9a-c4c00ef75f8c req-7c832c32-829e-4103-bbc5-a6ee9ba7949e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.719 2 DEBUG oslo_concurrency.lockutils [req-50e01076-918d-40e4-ac9a-c4c00ef75f8c req-7c832c32-829e-4103-bbc5-a6ee9ba7949e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.720 2 DEBUG oslo_concurrency.lockutils [req-50e01076-918d-40e4-ac9a-c4c00ef75f8c req-7c832c32-829e-4103-bbc5-a6ee9ba7949e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.720 2 DEBUG nova.compute.manager [req-50e01076-918d-40e4-ac9a-c4c00ef75f8c req-7c832c32-829e-4103-bbc5-a6ee9ba7949e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Processing event network-vif-plugged-1f41468b-a36d-4ea0-bf4c-26b26778c31d _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.721 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[0c327abb-52bf-4e3d-b100-6e9aaa87bf50]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.746 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[5265f211-f9ce-4997-ae57-b674533ad214]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8562f9c0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:ed:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434124, 'reachable_time': 33411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219647, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.768 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[5d63c2e1-d3e7-405f-8513-a185e950767d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef4:ed77'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434124, 'tstamp': 434124}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219648, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.793 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c305385e-9429-43d5-a209-915663545d33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8562f9c0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:ed:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434124, 'reachable_time': 33411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219649, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.827 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d77f9a3c-11a8-41c5-b247-8279566ae03a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.892 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[394122fc-382b-4471-88bc-5c175dda6e39]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.894 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8562f9c0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.894 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.894 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8562f9c0-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:00 compute-0 NetworkManager[51741]: <info>  [1759327980.8984] manager: (tap8562f9c0-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Oct 01 14:13:00 compute-0 kernel: tap8562f9c0-00: entered promiscuous mode
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.904 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8562f9c0-00, col_values=(('external_ids', {'iface-id': 'b5ee4d88-5d32-4dfa-ae97-c0c0976243b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:00 compute-0 ovn_controller[94909]: 2025-10-01T14:13:00Z|00102|binding|INFO|Releasing lport b5ee4d88-5d32-4dfa-ae97-c0c0976243b5 from this chassis (sb_readonly=0)
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:00 compute-0 nova_compute[192698]: 2025-10-01 14:13:00.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.936 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[12c4c2cd-c6f4-4854-97ea-1ce5220de4cc]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.937 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.937 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.937 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 8562f9c0-0a2b-4e53-975b-dd543293c802 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.937 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.938 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[99ee358a-5f90-4202-908d-c05e60c7b6ef]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.938 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.939 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4ea04c96-76cc-4cfc-9ecc-a19a307091d9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.939 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-8562f9c0-0a2b-4e53-975b-dd543293c802
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID 8562f9c0-0a2b-4e53-975b-dd543293c802
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:13:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:00.940 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'env', 'PROCESS_TAG=haproxy-8562f9c0-0a2b-4e53-975b-dd543293c802', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8562f9c0-0a2b-4e53-975b-dd543293c802.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:13:01 compute-0 podman[219678]: 2025-10-01 14:13:01.399872635 +0000 UTC m=+0.094983251 container create d3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:13:01 compute-0 openstack_network_exporter[205307]: ERROR   14:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:13:01 compute-0 openstack_network_exporter[205307]: ERROR   14:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:13:01 compute-0 openstack_network_exporter[205307]: ERROR   14:13:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:13:01 compute-0 podman[219678]: 2025-10-01 14:13:01.352438762 +0000 UTC m=+0.047549418 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:13:01 compute-0 openstack_network_exporter[205307]: ERROR   14:13:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:13:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:13:01 compute-0 openstack_network_exporter[205307]: ERROR   14:13:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:13:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:13:01 compute-0 systemd[1]: Started libpod-conmon-d3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888.scope.
Oct 01 14:13:01 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:13:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/112fa9c377ba64626677f973a8fb3ddbb7886222e840d5840ce7fe7bc22c3a02/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:13:01 compute-0 podman[219678]: 2025-10-01 14:13:01.53719171 +0000 UTC m=+0.232302306 container init d3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:13:01 compute-0 podman[219678]: 2025-10-01 14:13:01.545914406 +0000 UTC m=+0.241024982 container start d3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:13:01 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[219693]: [NOTICE]   (219697) : New worker (219699) forked
Oct 01 14:13:01 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[219693]: [NOTICE]   (219697) : Loading success.
Oct 01 14:13:01 compute-0 anacron[1070]: Job `cron.monthly' started
Oct 01 14:13:01 compute-0 anacron[1070]: Job `cron.monthly' terminated
Oct 01 14:13:01 compute-0 anacron[1070]: Normal exit (3 jobs run)
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.350 2 DEBUG nova.compute.manager [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.354 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.358 2 INFO nova.virt.libvirt.driver [-] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Instance spawned successfully.
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.359 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.768 2 DEBUG nova.compute.manager [req-c0aa1d9d-d9cb-4224-a98d-f0dff3c59093 req-42e74281-0d23-42de-adac-9ef3928e7799 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Received event network-vif-plugged-1f41468b-a36d-4ea0-bf4c-26b26778c31d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.768 2 DEBUG oslo_concurrency.lockutils [req-c0aa1d9d-d9cb-4224-a98d-f0dff3c59093 req-42e74281-0d23-42de-adac-9ef3928e7799 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.769 2 DEBUG oslo_concurrency.lockutils [req-c0aa1d9d-d9cb-4224-a98d-f0dff3c59093 req-42e74281-0d23-42de-adac-9ef3928e7799 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.769 2 DEBUG oslo_concurrency.lockutils [req-c0aa1d9d-d9cb-4224-a98d-f0dff3c59093 req-42e74281-0d23-42de-adac-9ef3928e7799 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.769 2 DEBUG nova.compute.manager [req-c0aa1d9d-d9cb-4224-a98d-f0dff3c59093 req-42e74281-0d23-42de-adac-9ef3928e7799 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] No waiting events found dispatching network-vif-plugged-1f41468b-a36d-4ea0-bf4c-26b26778c31d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.769 2 WARNING nova.compute.manager [req-c0aa1d9d-d9cb-4224-a98d-f0dff3c59093 req-42e74281-0d23-42de-adac-9ef3928e7799 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Received unexpected event network-vif-plugged-1f41468b-a36d-4ea0-bf4c-26b26778c31d for instance with vm_state building and task_state spawning.
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.884 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.884 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.885 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.885 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.886 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:13:02 compute-0 nova_compute[192698]: 2025-10-01 14:13:02.886 2 DEBUG nova.virt.libvirt.driver [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:13:03 compute-0 nova_compute[192698]: 2025-10-01 14:13:03.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:03 compute-0 nova_compute[192698]: 2025-10-01 14:13:03.396 2 INFO nova.compute.manager [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Took 12.39 seconds to spawn the instance on the hypervisor.
Oct 01 14:13:03 compute-0 nova_compute[192698]: 2025-10-01 14:13:03.397 2 DEBUG nova.compute.manager [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:13:03 compute-0 nova_compute[192698]: 2025-10-01 14:13:03.933 2 INFO nova.compute.manager [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Took 17.63 seconds to build instance.
Oct 01 14:13:04 compute-0 nova_compute[192698]: 2025-10-01 14:13:04.440 2 DEBUG oslo_concurrency.lockutils [None req-7b91daa9-8969-48e9-ae45-8d49b49ee664 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.156s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:05 compute-0 podman[219717]: 2025-10-01 14:13:05.165359378 +0000 UTC m=+0.081864856 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, name=ubi9-minimal, version=9.6, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Oct 01 14:13:07 compute-0 nova_compute[192698]: 2025-10-01 14:13:07.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:08 compute-0 nova_compute[192698]: 2025-10-01 14:13:08.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:11 compute-0 podman[219740]: 2025-10-01 14:13:11.189914278 +0000 UTC m=+0.097964091 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct 01 14:13:11 compute-0 podman[219739]: 2025-10-01 14:13:11.20478846 +0000 UTC m=+0.110295805 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Oct 01 14:13:11 compute-0 nova_compute[192698]: 2025-10-01 14:13:11.927 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:13:11 compute-0 nova_compute[192698]: 2025-10-01 14:13:11.929 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:13:11 compute-0 nova_compute[192698]: 2025-10-01 14:13:11.930 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 01 14:13:12 compute-0 nova_compute[192698]: 2025-10-01 14:13:12.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:13 compute-0 nova_compute[192698]: 2025-10-01 14:13:13.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:13 compute-0 nova_compute[192698]: 2025-10-01 14:13:13.443 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:13:13 compute-0 ovn_controller[94909]: 2025-10-01T14:13:13Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:df:81:60 10.100.0.5
Oct 01 14:13:13 compute-0 ovn_controller[94909]: 2025-10-01T14:13:13Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:df:81:60 10.100.0.5
Oct 01 14:13:13 compute-0 nova_compute[192698]: 2025-10-01 14:13:13.959 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:13 compute-0 nova_compute[192698]: 2025-10-01 14:13:13.960 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:13 compute-0 nova_compute[192698]: 2025-10-01 14:13:13.960 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:13 compute-0 nova_compute[192698]: 2025-10-01 14:13:13.961 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:13:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:14.253 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:14.254 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:14.256 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:15 compute-0 nova_compute[192698]: 2025-10-01 14:13:15.017 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:13:15 compute-0 nova_compute[192698]: 2025-10-01 14:13:15.107 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:13:15 compute-0 nova_compute[192698]: 2025-10-01 14:13:15.108 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:13:15 compute-0 nova_compute[192698]: 2025-10-01 14:13:15.216 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8/disk --force-share --output=json" returned: 0 in 0.108s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:13:15 compute-0 nova_compute[192698]: 2025-10-01 14:13:15.457 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:13:15 compute-0 nova_compute[192698]: 2025-10-01 14:13:15.459 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:13:15 compute-0 nova_compute[192698]: 2025-10-01 14:13:15.503 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:13:15 compute-0 nova_compute[192698]: 2025-10-01 14:13:15.503 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5701MB free_disk=73.27356338500977GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:13:15 compute-0 nova_compute[192698]: 2025-10-01 14:13:15.504 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:15 compute-0 nova_compute[192698]: 2025-10-01 14:13:15.504 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:16 compute-0 podman[219801]: 2025-10-01 14:13:16.165918789 +0000 UTC m=+0.072947024 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:13:16 compute-0 nova_compute[192698]: 2025-10-01 14:13:16.552 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 9678ec54-31c4-4d96-a3f0-96686482f8b8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:13:17 compute-0 nova_compute[192698]: 2025-10-01 14:13:17.061 2 WARNING nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 35606c68-c638-48b1-bf80-6235d971579d has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 01 14:13:17 compute-0 nova_compute[192698]: 2025-10-01 14:13:17.061 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:13:17 compute-0 nova_compute[192698]: 2025-10-01 14:13:17.062 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:13:15 up  1:12,  0 user,  load average: 0.33, 0.25, 0.39\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_9f5565c36a294928af6bcd073bff4643': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:13:17 compute-0 nova_compute[192698]: 2025-10-01 14:13:17.093 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing inventories for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 01 14:13:17 compute-0 nova_compute[192698]: 2025-10-01 14:13:17.111 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating ProviderTree inventory for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 01 14:13:17 compute-0 nova_compute[192698]: 2025-10-01 14:13:17.112 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 14:13:17 compute-0 nova_compute[192698]: 2025-10-01 14:13:17.131 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing aggregate associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 01 14:13:17 compute-0 nova_compute[192698]: 2025-10-01 14:13:17.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:17 compute-0 nova_compute[192698]: 2025-10-01 14:13:17.165 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing trait associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, traits: COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 01 14:13:17 compute-0 nova_compute[192698]: 2025-10-01 14:13:17.231 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:13:17 compute-0 nova_compute[192698]: 2025-10-01 14:13:17.743 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:13:18 compute-0 nova_compute[192698]: 2025-10-01 14:13:18.031 2 DEBUG nova.virt.libvirt.driver [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Creating tmpfile /var/lib/nova/instances/tmpg600rjkp to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 01 14:13:18 compute-0 nova_compute[192698]: 2025-10-01 14:13:18.033 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:18 compute-0 nova_compute[192698]: 2025-10-01 14:13:18.039 2 DEBUG nova.compute.manager [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg600rjkp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 01 14:13:18 compute-0 nova_compute[192698]: 2025-10-01 14:13:18.253 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:13:18 compute-0 nova_compute[192698]: 2025-10-01 14:13:18.254 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.749s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:18 compute-0 nova_compute[192698]: 2025-10-01 14:13:18.254 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:13:18 compute-0 nova_compute[192698]: 2025-10-01 14:13:18.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:19 compute-0 nova_compute[192698]: 2025-10-01 14:13:19.245 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:13:19 compute-0 nova_compute[192698]: 2025-10-01 14:13:19.246 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:13:19 compute-0 nova_compute[192698]: 2025-10-01 14:13:19.246 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:13:19 compute-0 nova_compute[192698]: 2025-10-01 14:13:19.247 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:13:19 compute-0 nova_compute[192698]: 2025-10-01 14:13:19.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:13:20 compute-0 nova_compute[192698]: 2025-10-01 14:13:20.065 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:20 compute-0 nova_compute[192698]: 2025-10-01 14:13:20.428 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:13:22 compute-0 nova_compute[192698]: 2025-10-01 14:13:22.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:22 compute-0 nova_compute[192698]: 2025-10-01 14:13:22.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:13:22 compute-0 nova_compute[192698]: 2025-10-01 14:13:22.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:13:23 compute-0 nova_compute[192698]: 2025-10-01 14:13:23.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:24 compute-0 nova_compute[192698]: 2025-10-01 14:13:24.082 2 DEBUG nova.compute.manager [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg600rjkp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='35606c68-c638-48b1-bf80-6235d971579d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 01 14:13:25 compute-0 nova_compute[192698]: 2025-10-01 14:13:25.102 2 DEBUG oslo_concurrency.lockutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-35606c68-c638-48b1-bf80-6235d971579d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:13:25 compute-0 nova_compute[192698]: 2025-10-01 14:13:25.103 2 DEBUG oslo_concurrency.lockutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-35606c68-c638-48b1-bf80-6235d971579d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:13:25 compute-0 nova_compute[192698]: 2025-10-01 14:13:25.103 2 DEBUG nova.network.neutron [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:13:25 compute-0 nova_compute[192698]: 2025-10-01 14:13:25.611 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:26 compute-0 nova_compute[192698]: 2025-10-01 14:13:26.130 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:26 compute-0 nova_compute[192698]: 2025-10-01 14:13:26.262 2 DEBUG nova.network.neutron [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Updating instance_info_cache with network_info: [{"id": "1ead854c-84de-4a64-a4bf-8c9c51f98e13", "address": "fa:16:3e:cb:92:e9", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ead854c-84", "ovs_interfaceid": "1ead854c-84de-4a64-a4bf-8c9c51f98e13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:13:26 compute-0 nova_compute[192698]: 2025-10-01 14:13:26.768 2 DEBUG oslo_concurrency.lockutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-35606c68-c638-48b1-bf80-6235d971579d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:13:26 compute-0 nova_compute[192698]: 2025-10-01 14:13:26.786 2 DEBUG nova.virt.libvirt.driver [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg600rjkp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='35606c68-c638-48b1-bf80-6235d971579d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 01 14:13:26 compute-0 nova_compute[192698]: 2025-10-01 14:13:26.788 2 DEBUG nova.virt.libvirt.driver [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Creating instance directory: /var/lib/nova/instances/35606c68-c638-48b1-bf80-6235d971579d pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 01 14:13:26 compute-0 nova_compute[192698]: 2025-10-01 14:13:26.789 2 DEBUG nova.virt.libvirt.driver [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Creating disk.info with the contents: {'/var/lib/nova/instances/35606c68-c638-48b1-bf80-6235d971579d/disk': 'qcow2', '/var/lib/nova/instances/35606c68-c638-48b1-bf80-6235d971579d/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 01 14:13:26 compute-0 nova_compute[192698]: 2025-10-01 14:13:26.789 2 DEBUG nova.virt.libvirt.driver [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 01 14:13:26 compute-0 nova_compute[192698]: 2025-10-01 14:13:26.790 2 DEBUG nova.objects.instance [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 35606c68-c638-48b1-bf80-6235d971579d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.298 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.305 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.308 2 DEBUG oslo_concurrency.processutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.389 2 DEBUG oslo_concurrency.processutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.391 2 DEBUG oslo_concurrency.lockutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.392 2 DEBUG oslo_concurrency.lockutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.393 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.400 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.401 2 DEBUG oslo_concurrency.processutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.464 2 DEBUG oslo_concurrency.processutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.466 2 DEBUG oslo_concurrency.processutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/35606c68-c638-48b1-bf80-6235d971579d/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.501 2 DEBUG oslo_concurrency.processutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/35606c68-c638-48b1-bf80-6235d971579d/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.502 2 DEBUG oslo_concurrency.lockutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.503 2 DEBUG oslo_concurrency.processutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.558 2 DEBUG oslo_concurrency.processutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.559 2 DEBUG nova.virt.disk.api [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/35606c68-c638-48b1-bf80-6235d971579d/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.560 2 DEBUG oslo_concurrency.processutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35606c68-c638-48b1-bf80-6235d971579d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.617 2 DEBUG oslo_concurrency.processutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/35606c68-c638-48b1-bf80-6235d971579d/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.618 2 DEBUG nova.virt.disk.api [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/35606c68-c638-48b1-bf80-6235d971579d/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.619 2 DEBUG nova.objects.instance [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 35606c68-c638-48b1-bf80-6235d971579d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:13:27 compute-0 nova_compute[192698]: 2025-10-01 14:13:27.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 01 14:13:28 compute-0 podman[219840]: 2025-10-01 14:13:28.142071566 +0000 UTC m=+0.059102480 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.199 2 DEBUG nova.objects.base [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<35606c68-c638-48b1-bf80-6235d971579d> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.200 2 DEBUG oslo_concurrency.processutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/35606c68-c638-48b1-bf80-6235d971579d/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.228 2 DEBUG oslo_concurrency.processutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/35606c68-c638-48b1-bf80-6235d971579d/disk.config 497664" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.229 2 DEBUG nova.virt.libvirt.driver [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.230 2 DEBUG nova.virt.libvirt.vif [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-01T14:12:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-736323948',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-736323948',id=12,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:12:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9f5565c36a294928af6bcd073bff4643',ramdisk_id='',reservation_id='r-4dxkp51y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-132658549',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-132658549-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:12:40Z,user_data=None,user_id='8e4b771b5757444093151a3e38c0b2d7',uuid=35606c68-c638-48b1-bf80-6235d971579d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ead854c-84de-4a64-a4bf-8c9c51f98e13", "address": "fa:16:3e:cb:92:e9", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1ead854c-84", "ovs_interfaceid": "1ead854c-84de-4a64-a4bf-8c9c51f98e13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.231 2 DEBUG nova.network.os_vif_util [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "1ead854c-84de-4a64-a4bf-8c9c51f98e13", "address": "fa:16:3e:cb:92:e9", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1ead854c-84", "ovs_interfaceid": "1ead854c-84de-4a64-a4bf-8c9c51f98e13", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.232 2 DEBUG nova.network.os_vif_util [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=1ead854c-84de-4a64-a4bf-8c9c51f98e13,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ead854c-84') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.233 2 DEBUG os_vif [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=1ead854c-84de-4a64-a4bf-8c9c51f98e13,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ead854c-84') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.234 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.235 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.238 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5afb05b6-dcd0-5780-93bc-d1c8f79c2b6a', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:28 compute-0 podman[219841]: 2025-10-01 14:13:28.238032942 +0000 UTC m=+0.146008781 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.245 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1ead854c-84, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.246 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap1ead854c-84, col_values=(('qos', UUID('15a6322d-ace5-4af3-8ecd-39cb3284e986')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.246 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap1ead854c-84, col_values=(('external_ids', {'iface-id': '1ead854c-84de-4a64-a4bf-8c9c51f98e13', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:92:e9', 'vm-uuid': '35606c68-c638-48b1-bf80-6235d971579d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:28 compute-0 NetworkManager[51741]: <info>  [1759328008.2483] manager: (tap1ead854c-84): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.257 2 INFO os_vif [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=1ead854c-84de-4a64-a4bf-8c9c51f98e13,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ead854c-84')
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.258 2 DEBUG nova.virt.libvirt.driver [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.258 2 DEBUG nova.compute.manager [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg600rjkp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='35606c68-c638-48b1-bf80-6235d971579d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.259 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.336 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:28 compute-0 nova_compute[192698]: 2025-10-01 14:13:28.435 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 01 14:13:29 compute-0 nova_compute[192698]: 2025-10-01 14:13:29.260 2 DEBUG nova.network.neutron [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Port 1ead854c-84de-4a64-a4bf-8c9c51f98e13 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 01 14:13:29 compute-0 nova_compute[192698]: 2025-10-01 14:13:29.314 2 DEBUG nova.compute.manager [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg600rjkp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='35606c68-c638-48b1-bf80-6235d971579d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 01 14:13:29 compute-0 podman[203144]: time="2025-10-01T14:13:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:13:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:13:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:13:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:13:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3487 "" "Go-http-client/1.1"
Oct 01 14:13:30 compute-0 ovn_controller[94909]: 2025-10-01T14:13:30Z|00103|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 01 14:13:31 compute-0 openstack_network_exporter[205307]: ERROR   14:13:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:13:31 compute-0 openstack_network_exporter[205307]: ERROR   14:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:13:31 compute-0 openstack_network_exporter[205307]: ERROR   14:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:13:31 compute-0 openstack_network_exporter[205307]: ERROR   14:13:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:13:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:13:31 compute-0 openstack_network_exporter[205307]: ERROR   14:13:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:13:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:13:32 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 01 14:13:32 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 01 14:13:32 compute-0 NetworkManager[51741]: <info>  [1759328012.4512] manager: (tap1ead854c-84): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Oct 01 14:13:32 compute-0 kernel: tap1ead854c-84: entered promiscuous mode
Oct 01 14:13:32 compute-0 ovn_controller[94909]: 2025-10-01T14:13:32Z|00104|binding|INFO|Claiming lport 1ead854c-84de-4a64-a4bf-8c9c51f98e13 for this additional chassis.
Oct 01 14:13:32 compute-0 ovn_controller[94909]: 2025-10-01T14:13:32Z|00105|binding|INFO|1ead854c-84de-4a64-a4bf-8c9c51f98e13: Claiming fa:16:3e:cb:92:e9 10.100.0.6
Oct 01 14:13:32 compute-0 nova_compute[192698]: 2025-10-01 14:13:32.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.461 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:92:e9 10.100.0.6'], port_security=['fa:16:3e:cb:92:e9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '35606c68-c638-48b1-bf80-6235d971579d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8562f9c0-0a2b-4e53-975b-dd543293c802', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f5565c36a294928af6bcd073bff4643', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'd07d6cb5-684b-4a4b-83f2-c6fbca49c797', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18a05557-2e37-4ffc-9c62-b55a7756059d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=1ead854c-84de-4a64-a4bf-8c9c51f98e13) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.462 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 1ead854c-84de-4a64-a4bf-8c9c51f98e13 in datapath 8562f9c0-0a2b-4e53-975b-dd543293c802 unbound from our chassis
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.464 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8562f9c0-0a2b-4e53-975b-dd543293c802
Oct 01 14:13:32 compute-0 ovn_controller[94909]: 2025-10-01T14:13:32Z|00106|binding|INFO|Setting lport 1ead854c-84de-4a64-a4bf-8c9c51f98e13 ovn-installed in OVS
Oct 01 14:13:32 compute-0 nova_compute[192698]: 2025-10-01 14:13:32.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:32 compute-0 nova_compute[192698]: 2025-10-01 14:13:32.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.486 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[fb434397-0067-4105-a68f-eaf6d709b966]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:32 compute-0 systemd-machined[152704]: New machine qemu-9-instance-0000000c.
Oct 01 14:13:32 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000c.
Oct 01 14:13:32 compute-0 systemd-udevd[219926]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.528 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[b65a984f-1e0f-4ee8-b52e-4c3649e27afd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.534 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[f9da76d8-fb9f-4dff-be98-92be0676aec3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:32 compute-0 NetworkManager[51741]: <info>  [1759328012.5458] device (tap1ead854c-84): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:13:32 compute-0 NetworkManager[51741]: <info>  [1759328012.5480] device (tap1ead854c-84): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.580 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[b401b0a6-89ba-4315-b7af-c75e66bc8d5c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.608 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c2e348-cf8e-46c9-b281-9e61834b9d37]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8562f9c0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:ed:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434124, 'reachable_time': 33411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219936, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.632 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c7941e4f-5926-4d81-8b54-2da4a4bfb283]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8562f9c0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434138, 'tstamp': 434138}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219938, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8562f9c0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434142, 'tstamp': 434142}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219938, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.634 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8562f9c0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:32 compute-0 nova_compute[192698]: 2025-10-01 14:13:32.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:32 compute-0 nova_compute[192698]: 2025-10-01 14:13:32.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.640 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8562f9c0-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.640 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.641 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8562f9c0-00, col_values=(('external_ids', {'iface-id': 'b5ee4d88-5d32-4dfa-ae97-c0c0976243b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.641 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:13:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:32.643 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d294f45f-d2da-4348-a40f-5c0ef92a3da8]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-8562f9c0-0a2b-4e53-975b-dd543293c802\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 8562f9c0-0a2b-4e53-975b-dd543293c802\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:33 compute-0 nova_compute[192698]: 2025-10-01 14:13:33.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:33 compute-0 nova_compute[192698]: 2025-10-01 14:13:33.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:35.594 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:13:35 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:35.594 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:13:35 compute-0 nova_compute[192698]: 2025-10-01 14:13:35.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:35 compute-0 ovn_controller[94909]: 2025-10-01T14:13:35Z|00107|binding|INFO|Claiming lport 1ead854c-84de-4a64-a4bf-8c9c51f98e13 for this chassis.
Oct 01 14:13:35 compute-0 ovn_controller[94909]: 2025-10-01T14:13:35Z|00108|binding|INFO|1ead854c-84de-4a64-a4bf-8c9c51f98e13: Claiming fa:16:3e:cb:92:e9 10.100.0.6
Oct 01 14:13:35 compute-0 ovn_controller[94909]: 2025-10-01T14:13:35Z|00109|binding|INFO|Setting lport 1ead854c-84de-4a64-a4bf-8c9c51f98e13 up in Southbound
Oct 01 14:13:36 compute-0 podman[219958]: 2025-10-01 14:13:36.202187617 +0000 UTC m=+0.104671113 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350)
Oct 01 14:13:36 compute-0 nova_compute[192698]: 2025-10-01 14:13:36.727 2 INFO nova.compute.manager [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Post operation of migration started
Oct 01 14:13:36 compute-0 nova_compute[192698]: 2025-10-01 14:13:36.729 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:37 compute-0 nova_compute[192698]: 2025-10-01 14:13:37.059 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:37 compute-0 nova_compute[192698]: 2025-10-01 14:13:37.059 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:37 compute-0 nova_compute[192698]: 2025-10-01 14:13:37.159 2 DEBUG oslo_concurrency.lockutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-35606c68-c638-48b1-bf80-6235d971579d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:13:37 compute-0 nova_compute[192698]: 2025-10-01 14:13:37.159 2 DEBUG oslo_concurrency.lockutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-35606c68-c638-48b1-bf80-6235d971579d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:13:37 compute-0 nova_compute[192698]: 2025-10-01 14:13:37.160 2 DEBUG nova.network.neutron [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:13:37 compute-0 nova_compute[192698]: 2025-10-01 14:13:37.668 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:38 compute-0 nova_compute[192698]: 2025-10-01 14:13:38.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:38 compute-0 nova_compute[192698]: 2025-10-01 14:13:38.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:38 compute-0 nova_compute[192698]: 2025-10-01 14:13:38.463 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:38.599 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:38 compute-0 nova_compute[192698]: 2025-10-01 14:13:38.650 2 DEBUG nova.network.neutron [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Updating instance_info_cache with network_info: [{"id": "1ead854c-84de-4a64-a4bf-8c9c51f98e13", "address": "fa:16:3e:cb:92:e9", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ead854c-84", "ovs_interfaceid": "1ead854c-84de-4a64-a4bf-8c9c51f98e13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:13:39 compute-0 nova_compute[192698]: 2025-10-01 14:13:39.158 2 DEBUG oslo_concurrency.lockutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-35606c68-c638-48b1-bf80-6235d971579d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:13:39 compute-0 nova_compute[192698]: 2025-10-01 14:13:39.684 2 DEBUG oslo_concurrency.lockutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:39 compute-0 nova_compute[192698]: 2025-10-01 14:13:39.684 2 DEBUG oslo_concurrency.lockutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:39 compute-0 nova_compute[192698]: 2025-10-01 14:13:39.685 2 DEBUG oslo_concurrency.lockutils [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:39 compute-0 nova_compute[192698]: 2025-10-01 14:13:39.692 2 INFO nova.virt.libvirt.driver [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 01 14:13:39 compute-0 virtqemud[192597]: Domain id=9 name='instance-0000000c' uuid=35606c68-c638-48b1-bf80-6235d971579d is tainted: custom-monitor
Oct 01 14:13:40 compute-0 nova_compute[192698]: 2025-10-01 14:13:40.702 2 INFO nova.virt.libvirt.driver [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 01 14:13:41 compute-0 nova_compute[192698]: 2025-10-01 14:13:41.710 2 INFO nova.virt.libvirt.driver [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 01 14:13:41 compute-0 nova_compute[192698]: 2025-10-01 14:13:41.718 2 DEBUG nova.compute.manager [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:13:42 compute-0 podman[219981]: 2025-10-01 14:13:42.196158747 +0000 UTC m=+0.100994253 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 01 14:13:42 compute-0 podman[219980]: 2025-10-01 14:13:42.212426888 +0000 UTC m=+0.112189727 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:13:42 compute-0 nova_compute[192698]: 2025-10-01 14:13:42.231 2 DEBUG nova.objects.instance [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:13:43 compute-0 nova_compute[192698]: 2025-10-01 14:13:43.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:43 compute-0 nova_compute[192698]: 2025-10-01 14:13:43.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:43 compute-0 nova_compute[192698]: 2025-10-01 14:13:43.297 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:44 compute-0 nova_compute[192698]: 2025-10-01 14:13:44.161 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:44 compute-0 nova_compute[192698]: 2025-10-01 14:13:44.162 2 WARNING neutronclient.v2_0.client [None req-ead06cd6-a48a-4cdf-81e4-b053e68dd974 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:47 compute-0 podman[220020]: 2025-10-01 14:13:47.182184631 +0000 UTC m=+0.090067538 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:13:48 compute-0 nova_compute[192698]: 2025-10-01 14:13:48.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:48 compute-0 nova_compute[192698]: 2025-10-01 14:13:48.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:48 compute-0 nova_compute[192698]: 2025-10-01 14:13:48.303 2 DEBUG oslo_concurrency.lockutils [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "9678ec54-31c4-4d96-a3f0-96686482f8b8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:48 compute-0 nova_compute[192698]: 2025-10-01 14:13:48.303 2 DEBUG oslo_concurrency.lockutils [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:48 compute-0 nova_compute[192698]: 2025-10-01 14:13:48.303 2 DEBUG oslo_concurrency.lockutils [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:48 compute-0 nova_compute[192698]: 2025-10-01 14:13:48.304 2 DEBUG oslo_concurrency.lockutils [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:48 compute-0 nova_compute[192698]: 2025-10-01 14:13:48.304 2 DEBUG oslo_concurrency.lockutils [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:48 compute-0 nova_compute[192698]: 2025-10-01 14:13:48.318 2 INFO nova.compute.manager [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Terminating instance
Oct 01 14:13:48 compute-0 nova_compute[192698]: 2025-10-01 14:13:48.841 2 DEBUG nova.compute.manager [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:13:48 compute-0 kernel: tap1f41468b-a3 (unregistering): left promiscuous mode
Oct 01 14:13:48 compute-0 NetworkManager[51741]: <info>  [1759328028.8702] device (tap1f41468b-a3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:13:48 compute-0 ovn_controller[94909]: 2025-10-01T14:13:48Z|00110|binding|INFO|Releasing lport 1f41468b-a36d-4ea0-bf4c-26b26778c31d from this chassis (sb_readonly=0)
Oct 01 14:13:48 compute-0 nova_compute[192698]: 2025-10-01 14:13:48.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:48 compute-0 ovn_controller[94909]: 2025-10-01T14:13:48Z|00111|binding|INFO|Setting lport 1f41468b-a36d-4ea0-bf4c-26b26778c31d down in Southbound
Oct 01 14:13:48 compute-0 ovn_controller[94909]: 2025-10-01T14:13:48Z|00112|binding|INFO|Removing iface tap1f41468b-a3 ovn-installed in OVS
Oct 01 14:13:48 compute-0 nova_compute[192698]: 2025-10-01 14:13:48.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:48.895 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:81:60 10.100.0.5'], port_security=['fa:16:3e:df:81:60 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9678ec54-31c4-4d96-a3f0-96686482f8b8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8562f9c0-0a2b-4e53-975b-dd543293c802', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f5565c36a294928af6bcd073bff4643', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'd07d6cb5-684b-4a4b-83f2-c6fbca49c797', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18a05557-2e37-4ffc-9c62-b55a7756059d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=1f41468b-a36d-4ea0-bf4c-26b26778c31d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:13:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:48.896 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 1f41468b-a36d-4ea0-bf4c-26b26778c31d in datapath 8562f9c0-0a2b-4e53-975b-dd543293c802 unbound from our chassis
Oct 01 14:13:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:48.897 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8562f9c0-0a2b-4e53-975b-dd543293c802
Oct 01 14:13:48 compute-0 nova_compute[192698]: 2025-10-01 14:13:48.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:48.924 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[26e2b311-19d2-468d-a0af-fb44971b632e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:48 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Oct 01 14:13:48 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000d.scope: Consumed 15.511s CPU time.
Oct 01 14:13:48 compute-0 systemd-machined[152704]: Machine qemu-8-instance-0000000d terminated.
Oct 01 14:13:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:48.975 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[41475811-b5ef-4f21-bb22-164682cbde81]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:48.978 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[ba82fa8a-4c5c-4a88-baae-e971fa10e4d2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.010 2 DEBUG nova.compute.manager [req-c4793110-734a-4bba-8c7f-ea33ae353d1f req-7390fa0a-88c5-483a-be77-3fe01dac16f2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Received event network-vif-unplugged-1f41468b-a36d-4ea0-bf4c-26b26778c31d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.011 2 DEBUG oslo_concurrency.lockutils [req-c4793110-734a-4bba-8c7f-ea33ae353d1f req-7390fa0a-88c5-483a-be77-3fe01dac16f2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.011 2 DEBUG oslo_concurrency.lockutils [req-c4793110-734a-4bba-8c7f-ea33ae353d1f req-7390fa0a-88c5-483a-be77-3fe01dac16f2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.011 2 DEBUG oslo_concurrency.lockutils [req-c4793110-734a-4bba-8c7f-ea33ae353d1f req-7390fa0a-88c5-483a-be77-3fe01dac16f2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.011 2 DEBUG nova.compute.manager [req-c4793110-734a-4bba-8c7f-ea33ae353d1f req-7390fa0a-88c5-483a-be77-3fe01dac16f2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] No waiting events found dispatching network-vif-unplugged-1f41468b-a36d-4ea0-bf4c-26b26778c31d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.012 2 DEBUG nova.compute.manager [req-c4793110-734a-4bba-8c7f-ea33ae353d1f req-7390fa0a-88c5-483a-be77-3fe01dac16f2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Received event network-vif-unplugged-1f41468b-a36d-4ea0-bf4c-26b26778c31d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:13:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:49.033 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[5761e0dd-5bb2-422d-beab-e70c88b35ce6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:49.064 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ad2778-0021-4003-976e-1217303fe199]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8562f9c0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:ed:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434124, 'reachable_time': 33411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220058, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:49.094 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e2341e-e1f5-4f31-8d6e-28da6c9b9b7b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8562f9c0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434138, 'tstamp': 434138}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220062, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8562f9c0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434142, 'tstamp': 434142}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220062, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:49.096 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8562f9c0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:49.106 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8562f9c0-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:49.106 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:13:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:49.107 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8562f9c0-00, col_values=(('external_ids', {'iface-id': 'b5ee4d88-5d32-4dfa-ae97-c0c0976243b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:49.107 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:13:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:49.109 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[59bf22d5-2c2f-4713-a1ee-661017af6df6]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-8562f9c0-0a2b-4e53-975b-dd543293c802\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 8562f9c0-0a2b-4e53-975b-dd543293c802\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.128 2 INFO nova.virt.libvirt.driver [-] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Instance destroyed successfully.
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.129 2 DEBUG nova.objects.instance [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lazy-loading 'resources' on Instance uuid 9678ec54-31c4-4d96-a3f0-96686482f8b8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.636 2 DEBUG nova.virt.libvirt.vif [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:12:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1450345243',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1450345243',id=13,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:13:03Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f5565c36a294928af6bcd073bff4643',ramdisk_id='',reservation_id='r-ifkg8pgx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-132658549',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-132658549-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:13:03Z,user_data=None,user_id='8e4b771b5757444093151a3e38c0b2d7',uuid=9678ec54-31c4-4d96-a3f0-96686482f8b8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "address": "fa:16:3e:df:81:60", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f41468b-a3", "ovs_interfaceid": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.637 2 DEBUG nova.network.os_vif_util [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converting VIF {"id": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "address": "fa:16:3e:df:81:60", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f41468b-a3", "ovs_interfaceid": "1f41468b-a36d-4ea0-bf4c-26b26778c31d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.637 2 DEBUG nova.network.os_vif_util [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:81:60,bridge_name='br-int',has_traffic_filtering=True,id=1f41468b-a36d-4ea0-bf4c-26b26778c31d,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f41468b-a3') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.638 2 DEBUG os_vif [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:81:60,bridge_name='br-int',has_traffic_filtering=True,id=1f41468b-a36d-4ea0-bf4c-26b26778c31d,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f41468b-a3') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.640 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f41468b-a3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.645 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=3c174cb2-350c-417f-a5cf-a196aafff863) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.649 2 INFO os_vif [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:81:60,bridge_name='br-int',has_traffic_filtering=True,id=1f41468b-a36d-4ea0-bf4c-26b26778c31d,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f41468b-a3')
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.649 2 INFO nova.virt.libvirt.driver [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Deleting instance files /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8_del
Oct 01 14:13:49 compute-0 nova_compute[192698]: 2025-10-01 14:13:49.650 2 INFO nova.virt.libvirt.driver [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Deletion of /var/lib/nova/instances/9678ec54-31c4-4d96-a3f0-96686482f8b8_del complete
Oct 01 14:13:50 compute-0 nova_compute[192698]: 2025-10-01 14:13:50.164 2 INFO nova.compute.manager [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 01 14:13:50 compute-0 nova_compute[192698]: 2025-10-01 14:13:50.165 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:13:50 compute-0 nova_compute[192698]: 2025-10-01 14:13:50.165 2 DEBUG nova.compute.manager [-] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:13:50 compute-0 nova_compute[192698]: 2025-10-01 14:13:50.165 2 DEBUG nova.network.neutron [-] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:13:50 compute-0 nova_compute[192698]: 2025-10-01 14:13:50.166 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:51 compute-0 nova_compute[192698]: 2025-10-01 14:13:51.068 2 DEBUG nova.compute.manager [req-d55e5f2c-1d88-4177-96c0-93239c85872e req-4920d02a-d885-4236-ae0a-63b8de8fc692 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Received event network-vif-unplugged-1f41468b-a36d-4ea0-bf4c-26b26778c31d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:13:51 compute-0 nova_compute[192698]: 2025-10-01 14:13:51.068 2 DEBUG oslo_concurrency.lockutils [req-d55e5f2c-1d88-4177-96c0-93239c85872e req-4920d02a-d885-4236-ae0a-63b8de8fc692 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:51 compute-0 nova_compute[192698]: 2025-10-01 14:13:51.069 2 DEBUG oslo_concurrency.lockutils [req-d55e5f2c-1d88-4177-96c0-93239c85872e req-4920d02a-d885-4236-ae0a-63b8de8fc692 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:51 compute-0 nova_compute[192698]: 2025-10-01 14:13:51.069 2 DEBUG oslo_concurrency.lockutils [req-d55e5f2c-1d88-4177-96c0-93239c85872e req-4920d02a-d885-4236-ae0a-63b8de8fc692 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:51 compute-0 nova_compute[192698]: 2025-10-01 14:13:51.070 2 DEBUG nova.compute.manager [req-d55e5f2c-1d88-4177-96c0-93239c85872e req-4920d02a-d885-4236-ae0a-63b8de8fc692 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] No waiting events found dispatching network-vif-unplugged-1f41468b-a36d-4ea0-bf4c-26b26778c31d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:13:51 compute-0 nova_compute[192698]: 2025-10-01 14:13:51.070 2 DEBUG nova.compute.manager [req-d55e5f2c-1d88-4177-96c0-93239c85872e req-4920d02a-d885-4236-ae0a-63b8de8fc692 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Received event network-vif-unplugged-1f41468b-a36d-4ea0-bf4c-26b26778c31d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:13:51 compute-0 unix_chkpwd[220079]: password check failed for user (sshd)
Oct 01 14:13:51 compute-0 sshd-session[220077]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.116  user=sshd
Oct 01 14:13:51 compute-0 nova_compute[192698]: 2025-10-01 14:13:51.155 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:52 compute-0 nova_compute[192698]: 2025-10-01 14:13:52.295 2 DEBUG nova.compute.manager [req-c83a3095-6fe1-4568-b735-b488824d1ac7 req-75983ddd-34f9-4a0f-a831-0d0ca9589b9f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Received event network-vif-deleted-1f41468b-a36d-4ea0-bf4c-26b26778c31d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:13:52 compute-0 nova_compute[192698]: 2025-10-01 14:13:52.295 2 INFO nova.compute.manager [req-c83a3095-6fe1-4568-b735-b488824d1ac7 req-75983ddd-34f9-4a0f-a831-0d0ca9589b9f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Neutron deleted interface 1f41468b-a36d-4ea0-bf4c-26b26778c31d; detaching it from the instance and deleting it from the info cache
Oct 01 14:13:52 compute-0 nova_compute[192698]: 2025-10-01 14:13:52.296 2 DEBUG nova.network.neutron [req-c83a3095-6fe1-4568-b735-b488824d1ac7 req-75983ddd-34f9-4a0f-a831-0d0ca9589b9f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:13:52 compute-0 nova_compute[192698]: 2025-10-01 14:13:52.708 2 DEBUG nova.network.neutron [-] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:13:52 compute-0 nova_compute[192698]: 2025-10-01 14:13:52.809 2 DEBUG nova.compute.manager [req-c83a3095-6fe1-4568-b735-b488824d1ac7 req-75983ddd-34f9-4a0f-a831-0d0ca9589b9f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Detach interface failed, port_id=1f41468b-a36d-4ea0-bf4c-26b26778c31d, reason: Instance 9678ec54-31c4-4d96-a3f0-96686482f8b8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:13:53 compute-0 nova_compute[192698]: 2025-10-01 14:13:53.215 2 INFO nova.compute.manager [-] [instance: 9678ec54-31c4-4d96-a3f0-96686482f8b8] Took 3.05 seconds to deallocate network for instance.
Oct 01 14:13:53 compute-0 nova_compute[192698]: 2025-10-01 14:13:53.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:53 compute-0 sshd-session[220077]: Failed password for sshd from 80.94.95.116 port 62264 ssh2
Oct 01 14:13:53 compute-0 nova_compute[192698]: 2025-10-01 14:13:53.742 2 DEBUG oslo_concurrency.lockutils [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:53 compute-0 nova_compute[192698]: 2025-10-01 14:13:53.743 2 DEBUG oslo_concurrency.lockutils [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:53 compute-0 nova_compute[192698]: 2025-10-01 14:13:53.817 2 DEBUG nova.compute.provider_tree [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:13:54 compute-0 nova_compute[192698]: 2025-10-01 14:13:54.327 2 DEBUG nova.scheduler.client.report [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:13:54 compute-0 nova_compute[192698]: 2025-10-01 14:13:54.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:54 compute-0 nova_compute[192698]: 2025-10-01 14:13:54.840 2 DEBUG oslo_concurrency.lockutils [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:54 compute-0 nova_compute[192698]: 2025-10-01 14:13:54.860 2 INFO nova.scheduler.client.report [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Deleted allocations for instance 9678ec54-31c4-4d96-a3f0-96686482f8b8
Oct 01 14:13:55 compute-0 sshd-session[220077]: Connection closed by authenticating user sshd 80.94.95.116 port 62264 [preauth]
Oct 01 14:13:55 compute-0 nova_compute[192698]: 2025-10-01 14:13:55.894 2 DEBUG oslo_concurrency.lockutils [None req-251351ae-e093-4ef4-9806-c90bb357cda9 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "9678ec54-31c4-4d96-a3f0-96686482f8b8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.591s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:56 compute-0 nova_compute[192698]: 2025-10-01 14:13:56.611 2 DEBUG oslo_concurrency.lockutils [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "35606c68-c638-48b1-bf80-6235d971579d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:56 compute-0 nova_compute[192698]: 2025-10-01 14:13:56.612 2 DEBUG oslo_concurrency.lockutils [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "35606c68-c638-48b1-bf80-6235d971579d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:56 compute-0 nova_compute[192698]: 2025-10-01 14:13:56.612 2 DEBUG oslo_concurrency.lockutils [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "35606c68-c638-48b1-bf80-6235d971579d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:56 compute-0 nova_compute[192698]: 2025-10-01 14:13:56.612 2 DEBUG oslo_concurrency.lockutils [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "35606c68-c638-48b1-bf80-6235d971579d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:56 compute-0 nova_compute[192698]: 2025-10-01 14:13:56.612 2 DEBUG oslo_concurrency.lockutils [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "35606c68-c638-48b1-bf80-6235d971579d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:56 compute-0 nova_compute[192698]: 2025-10-01 14:13:56.623 2 INFO nova.compute.manager [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Terminating instance
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.139 2 DEBUG nova.compute.manager [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:13:57 compute-0 kernel: tap1ead854c-84 (unregistering): left promiscuous mode
Oct 01 14:13:57 compute-0 NetworkManager[51741]: <info>  [1759328037.1674] device (tap1ead854c-84): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:57 compute-0 ovn_controller[94909]: 2025-10-01T14:13:57Z|00113|binding|INFO|Releasing lport 1ead854c-84de-4a64-a4bf-8c9c51f98e13 from this chassis (sb_readonly=0)
Oct 01 14:13:57 compute-0 ovn_controller[94909]: 2025-10-01T14:13:57Z|00114|binding|INFO|Setting lport 1ead854c-84de-4a64-a4bf-8c9c51f98e13 down in Southbound
Oct 01 14:13:57 compute-0 ovn_controller[94909]: 2025-10-01T14:13:57Z|00115|binding|INFO|Removing iface tap1ead854c-84 ovn-installed in OVS
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.191 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:92:e9 10.100.0.6'], port_security=['fa:16:3e:cb:92:e9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '35606c68-c638-48b1-bf80-6235d971579d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8562f9c0-0a2b-4e53-975b-dd543293c802', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f5565c36a294928af6bcd073bff4643', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'd07d6cb5-684b-4a4b-83f2-c6fbca49c797', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18a05557-2e37-4ffc-9c62-b55a7756059d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=1ead854c-84de-4a64-a4bf-8c9c51f98e13) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.191 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 1ead854c-84de-4a64-a4bf-8c9c51f98e13 in datapath 8562f9c0-0a2b-4e53-975b-dd543293c802 unbound from our chassis
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.192 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8562f9c0-0a2b-4e53-975b-dd543293c802, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.193 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[bd1f6488-12e5-46f5-a633-932b128e803a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.194 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802 namespace which is not needed anymore
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:57 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Oct 01 14:13:57 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000c.scope: Consumed 3.245s CPU time.
Oct 01 14:13:57 compute-0 systemd-machined[152704]: Machine qemu-9-instance-0000000c terminated.
Oct 01 14:13:57 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[219693]: [NOTICE]   (219697) : haproxy version is 3.0.5-8e879a5
Oct 01 14:13:57 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[219693]: [NOTICE]   (219697) : path to executable is /usr/sbin/haproxy
Oct 01 14:13:57 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[219693]: [WARNING]  (219697) : Exiting Master process...
Oct 01 14:13:57 compute-0 podman[220108]: 2025-10-01 14:13:57.344122485 +0000 UTC m=+0.038317568 container kill d3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 01 14:13:57 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[219693]: [ALERT]    (219697) : Current worker (219699) exited with code 143 (Terminated)
Oct 01 14:13:57 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[219693]: [WARNING]  (219697) : All workers exited. Exiting... (0)
Oct 01 14:13:57 compute-0 systemd[1]: libpod-d3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888.scope: Deactivated successfully.
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:57 compute-0 podman[220128]: 2025-10-01 14:13:57.410422768 +0000 UTC m=+0.031380800 container died d3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.425 2 INFO nova.virt.libvirt.driver [-] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Instance destroyed successfully.
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.426 2 DEBUG nova.objects.instance [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lazy-loading 'resources' on Instance uuid 35606c68-c638-48b1-bf80-6235d971579d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:13:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888-userdata-shm.mount: Deactivated successfully.
Oct 01 14:13:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-112fa9c377ba64626677f973a8fb3ddbb7886222e840d5840ce7fe7bc22c3a02-merged.mount: Deactivated successfully.
Oct 01 14:13:57 compute-0 podman[220128]: 2025-10-01 14:13:57.468008476 +0000 UTC m=+0.088966478 container remove d3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.475 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[216e2304-ec6d-456b-bf0e-49e855384a45]: (4, ("Wed Oct  1 02:13:57 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802 (d3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888)\nd3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888\nWed Oct  1 02:13:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802 (d3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888)\nd3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:57 compute-0 systemd[1]: libpod-conmon-d3085196b551add75580ee148a204cb396373e4944aa95bb931ae257ed1e9888.scope: Deactivated successfully.
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.477 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[532ebb64-bc59-4b65-8526-cfe11ce2b3ca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.478 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.478 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c8c17bda-864e-4456-8e9f-8e4e8c7b5b47]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.480 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8562f9c0-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:57 compute-0 kernel: tap8562f9c0-00: left promiscuous mode
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.510 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[44690c60-b9af-45e0-96da-7b0d75baac0e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.557 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec40ec8-9928-4919-a69b-fbadf6198532]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.559 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[85e8e814-f139-4261-b8a5-cfbfee7d0532]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.585 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f4b7b513-14c0-4b86-ae53-add41d3c895d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434113, 'reachable_time': 30509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220176, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.588 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:13:57 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:13:57.588 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[9b6a5049-f019-4896-aca4-8e845ca0ecf6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:13:57 compute-0 systemd[1]: run-netns-ovnmeta\x2d8562f9c0\x2d0a2b\x2d4e53\x2d975b\x2ddd543293c802.mount: Deactivated successfully.
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.934 2 DEBUG nova.virt.libvirt.vif [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-01T14:12:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-736323948',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-736323948',id=12,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:12:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f5565c36a294928af6bcd073bff4643',ramdisk_id='',reservation_id='r-4dxkp51y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-132658549',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-132658549-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:13:42Z,user_data=None,user_id='8e4b771b5757444093151a3e38c0b2d7',uuid=35606c68-c638-48b1-bf80-6235d971579d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1ead854c-84de-4a64-a4bf-8c9c51f98e13", "address": "fa:16:3e:cb:92:e9", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ead854c-84", "ovs_interfaceid": "1ead854c-84de-4a64-a4bf-8c9c51f98e13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.935 2 DEBUG nova.network.os_vif_util [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converting VIF {"id": "1ead854c-84de-4a64-a4bf-8c9c51f98e13", "address": "fa:16:3e:cb:92:e9", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1ead854c-84", "ovs_interfaceid": "1ead854c-84de-4a64-a4bf-8c9c51f98e13", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.936 2 DEBUG nova.network.os_vif_util [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cb:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=1ead854c-84de-4a64-a4bf-8c9c51f98e13,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ead854c-84') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.936 2 DEBUG os_vif [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=1ead854c-84de-4a64-a4bf-8c9c51f98e13,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ead854c-84') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.939 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1ead854c-84, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.944 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=15a6322d-ace5-4af3-8ecd-39cb3284e986) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.949 2 INFO os_vif [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:92:e9,bridge_name='br-int',has_traffic_filtering=True,id=1ead854c-84de-4a64-a4bf-8c9c51f98e13,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1ead854c-84')
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.950 2 INFO nova.virt.libvirt.driver [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Deleting instance files /var/lib/nova/instances/35606c68-c638-48b1-bf80-6235d971579d_del
Oct 01 14:13:57 compute-0 nova_compute[192698]: 2025-10-01 14:13:57.950 2 INFO nova.virt.libvirt.driver [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Deletion of /var/lib/nova/instances/35606c68-c638-48b1-bf80-6235d971579d_del complete
Oct 01 14:13:58 compute-0 nova_compute[192698]: 2025-10-01 14:13:58.203 2 DEBUG nova.compute.manager [req-8600b801-dd63-4888-95fe-a304fd26dee5 req-624c2f41-8dac-4852-b283-64a39e26874b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Received event network-vif-unplugged-1ead854c-84de-4a64-a4bf-8c9c51f98e13 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:13:58 compute-0 nova_compute[192698]: 2025-10-01 14:13:58.204 2 DEBUG oslo_concurrency.lockutils [req-8600b801-dd63-4888-95fe-a304fd26dee5 req-624c2f41-8dac-4852-b283-64a39e26874b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "35606c68-c638-48b1-bf80-6235d971579d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:13:58 compute-0 nova_compute[192698]: 2025-10-01 14:13:58.204 2 DEBUG oslo_concurrency.lockutils [req-8600b801-dd63-4888-95fe-a304fd26dee5 req-624c2f41-8dac-4852-b283-64a39e26874b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "35606c68-c638-48b1-bf80-6235d971579d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:13:58 compute-0 nova_compute[192698]: 2025-10-01 14:13:58.205 2 DEBUG oslo_concurrency.lockutils [req-8600b801-dd63-4888-95fe-a304fd26dee5 req-624c2f41-8dac-4852-b283-64a39e26874b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "35606c68-c638-48b1-bf80-6235d971579d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:13:58 compute-0 nova_compute[192698]: 2025-10-01 14:13:58.205 2 DEBUG nova.compute.manager [req-8600b801-dd63-4888-95fe-a304fd26dee5 req-624c2f41-8dac-4852-b283-64a39e26874b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] No waiting events found dispatching network-vif-unplugged-1ead854c-84de-4a64-a4bf-8c9c51f98e13 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:13:58 compute-0 nova_compute[192698]: 2025-10-01 14:13:58.205 2 DEBUG nova.compute.manager [req-8600b801-dd63-4888-95fe-a304fd26dee5 req-624c2f41-8dac-4852-b283-64a39e26874b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Received event network-vif-unplugged-1ead854c-84de-4a64-a4bf-8c9c51f98e13 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:13:58 compute-0 nova_compute[192698]: 2025-10-01 14:13:58.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:13:58 compute-0 nova_compute[192698]: 2025-10-01 14:13:58.466 2 INFO nova.compute.manager [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Took 1.33 seconds to destroy the instance on the hypervisor.
Oct 01 14:13:58 compute-0 nova_compute[192698]: 2025-10-01 14:13:58.466 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:13:58 compute-0 nova_compute[192698]: 2025-10-01 14:13:58.467 2 DEBUG nova.compute.manager [-] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:13:58 compute-0 nova_compute[192698]: 2025-10-01 14:13:58.467 2 DEBUG nova.network.neutron [-] [instance: 35606c68-c638-48b1-bf80-6235d971579d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:13:58 compute-0 nova_compute[192698]: 2025-10-01 14:13:58.468 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:59 compute-0 nova_compute[192698]: 2025-10-01 14:13:59.162 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:13:59 compute-0 podman[220177]: 2025-10-01 14:13:59.178999996 +0000 UTC m=+0.086227434 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct 01 14:13:59 compute-0 podman[220178]: 2025-10-01 14:13:59.223542341 +0000 UTC m=+0.128738384 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 01 14:13:59 compute-0 podman[203144]: time="2025-10-01T14:13:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:13:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:13:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:13:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:13:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3022 "" "Go-http-client/1.1"
Oct 01 14:14:00 compute-0 nova_compute[192698]: 2025-10-01 14:14:00.257 2 DEBUG nova.compute.manager [req-ac55a1e5-a9e4-425f-8ea1-0799300b4aa4 req-1725c716-21f9-465a-9b55-1d27ba854617 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Received event network-vif-unplugged-1ead854c-84de-4a64-a4bf-8c9c51f98e13 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:14:00 compute-0 nova_compute[192698]: 2025-10-01 14:14:00.258 2 DEBUG oslo_concurrency.lockutils [req-ac55a1e5-a9e4-425f-8ea1-0799300b4aa4 req-1725c716-21f9-465a-9b55-1d27ba854617 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "35606c68-c638-48b1-bf80-6235d971579d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:14:00 compute-0 nova_compute[192698]: 2025-10-01 14:14:00.258 2 DEBUG oslo_concurrency.lockutils [req-ac55a1e5-a9e4-425f-8ea1-0799300b4aa4 req-1725c716-21f9-465a-9b55-1d27ba854617 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "35606c68-c638-48b1-bf80-6235d971579d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:14:00 compute-0 nova_compute[192698]: 2025-10-01 14:14:00.258 2 DEBUG oslo_concurrency.lockutils [req-ac55a1e5-a9e4-425f-8ea1-0799300b4aa4 req-1725c716-21f9-465a-9b55-1d27ba854617 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "35606c68-c638-48b1-bf80-6235d971579d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:00 compute-0 nova_compute[192698]: 2025-10-01 14:14:00.259 2 DEBUG nova.compute.manager [req-ac55a1e5-a9e4-425f-8ea1-0799300b4aa4 req-1725c716-21f9-465a-9b55-1d27ba854617 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] No waiting events found dispatching network-vif-unplugged-1ead854c-84de-4a64-a4bf-8c9c51f98e13 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:14:00 compute-0 nova_compute[192698]: 2025-10-01 14:14:00.259 2 DEBUG nova.compute.manager [req-ac55a1e5-a9e4-425f-8ea1-0799300b4aa4 req-1725c716-21f9-465a-9b55-1d27ba854617 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Received event network-vif-unplugged-1ead854c-84de-4a64-a4bf-8c9c51f98e13 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:14:01 compute-0 openstack_network_exporter[205307]: ERROR   14:14:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:14:01 compute-0 openstack_network_exporter[205307]: ERROR   14:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:14:01 compute-0 openstack_network_exporter[205307]: ERROR   14:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:14:01 compute-0 openstack_network_exporter[205307]: ERROR   14:14:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:14:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:14:01 compute-0 openstack_network_exporter[205307]: ERROR   14:14:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:14:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:14:01 compute-0 nova_compute[192698]: 2025-10-01 14:14:01.648 2 DEBUG nova.network.neutron [-] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:14:02 compute-0 nova_compute[192698]: 2025-10-01 14:14:02.155 2 INFO nova.compute.manager [-] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Took 3.69 seconds to deallocate network for instance.
Oct 01 14:14:02 compute-0 nova_compute[192698]: 2025-10-01 14:14:02.348 2 DEBUG nova.compute.manager [req-d8638c6d-468a-4c59-a4d9-898c19e6297c req-bed51f4a-29ef-43ec-b5ff-0ac56492abfb 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 35606c68-c638-48b1-bf80-6235d971579d] Received event network-vif-deleted-1ead854c-84de-4a64-a4bf-8c9c51f98e13 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:14:02 compute-0 nova_compute[192698]: 2025-10-01 14:14:02.735 2 DEBUG oslo_concurrency.lockutils [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:14:02 compute-0 nova_compute[192698]: 2025-10-01 14:14:02.736 2 DEBUG oslo_concurrency.lockutils [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:14:02 compute-0 nova_compute[192698]: 2025-10-01 14:14:02.743 2 DEBUG oslo_concurrency.lockutils [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:02 compute-0 nova_compute[192698]: 2025-10-01 14:14:02.794 2 INFO nova.scheduler.client.report [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Deleted allocations for instance 35606c68-c638-48b1-bf80-6235d971579d
Oct 01 14:14:02 compute-0 nova_compute[192698]: 2025-10-01 14:14:02.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:03 compute-0 nova_compute[192698]: 2025-10-01 14:14:03.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:03 compute-0 nova_compute[192698]: 2025-10-01 14:14:03.843 2 DEBUG oslo_concurrency.lockutils [None req-f2e61774-1963-4744-8bfd-1acd55481ce6 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "35606c68-c638-48b1-bf80-6235d971579d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.231s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:07 compute-0 podman[220222]: 2025-10-01 14:14:07.185150237 +0000 UTC m=+0.092042592 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, architecture=x86_64, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Oct 01 14:14:07 compute-0 nova_compute[192698]: 2025-10-01 14:14:07.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:08 compute-0 nova_compute[192698]: 2025-10-01 14:14:08.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:12 compute-0 nova_compute[192698]: 2025-10-01 14:14:12.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:13 compute-0 podman[220245]: 2025-10-01 14:14:13.139723032 +0000 UTC m=+0.062321697 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20250930)
Oct 01 14:14:13 compute-0 podman[220246]: 2025-10-01 14:14:13.168786179 +0000 UTC m=+0.078914267 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:14:13 compute-0 nova_compute[192698]: 2025-10-01 14:14:13.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:13 compute-0 nova_compute[192698]: 2025-10-01 14:14:13.435 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:14:13 compute-0 nova_compute[192698]: 2025-10-01 14:14:13.952 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:14:13 compute-0 nova_compute[192698]: 2025-10-01 14:14:13.953 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:14:13 compute-0 nova_compute[192698]: 2025-10-01 14:14:13.953 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:13 compute-0 nova_compute[192698]: 2025-10-01 14:14:13.953 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:14:14 compute-0 nova_compute[192698]: 2025-10-01 14:14:14.166 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:14:14 compute-0 nova_compute[192698]: 2025-10-01 14:14:14.168 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:14:14 compute-0 nova_compute[192698]: 2025-10-01 14:14:14.191 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:14:14 compute-0 nova_compute[192698]: 2025-10-01 14:14:14.192 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5859MB free_disk=73.30145645141602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:14:14 compute-0 nova_compute[192698]: 2025-10-01 14:14:14.192 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:14:14 compute-0 nova_compute[192698]: 2025-10-01 14:14:14.193 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:14:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:14.257 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:14:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:14.258 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:14:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:14.258 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:15 compute-0 nova_compute[192698]: 2025-10-01 14:14:15.298 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:14:15 compute-0 nova_compute[192698]: 2025-10-01 14:14:15.299 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:14:14 up  1:13,  0 user,  load average: 0.18, 0.22, 0.37\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:14:15 compute-0 nova_compute[192698]: 2025-10-01 14:14:15.355 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:14:15 compute-0 nova_compute[192698]: 2025-10-01 14:14:15.864 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:14:16 compute-0 nova_compute[192698]: 2025-10-01 14:14:16.373 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:14:16 compute-0 nova_compute[192698]: 2025-10-01 14:14:16.373 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.181s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:16 compute-0 nova_compute[192698]: 2025-10-01 14:14:16.863 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:14:16 compute-0 nova_compute[192698]: 2025-10-01 14:14:16.864 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:14:17 compute-0 nova_compute[192698]: 2025-10-01 14:14:17.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:14:17 compute-0 nova_compute[192698]: 2025-10-01 14:14:17.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:18 compute-0 podman[220286]: 2025-10-01 14:14:18.18984353 +0000 UTC m=+0.094578790 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:14:18 compute-0 nova_compute[192698]: 2025-10-01 14:14:18.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:18 compute-0 nova_compute[192698]: 2025-10-01 14:14:18.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:14:19 compute-0 nova_compute[192698]: 2025-10-01 14:14:19.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:14:21 compute-0 nova_compute[192698]: 2025-10-01 14:14:21.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:14:22 compute-0 nova_compute[192698]: 2025-10-01 14:14:22.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:23 compute-0 nova_compute[192698]: 2025-10-01 14:14:23.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:23 compute-0 nova_compute[192698]: 2025-10-01 14:14:23.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:14:23 compute-0 nova_compute[192698]: 2025-10-01 14:14:23.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:14:24 compute-0 nova_compute[192698]: 2025-10-01 14:14:24.101 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:14:27 compute-0 nova_compute[192698]: 2025-10-01 14:14:27.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:28 compute-0 nova_compute[192698]: 2025-10-01 14:14:28.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:29 compute-0 podman[203144]: time="2025-10-01T14:14:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:14:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:14:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:14:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:14:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3024 "" "Go-http-client/1.1"
Oct 01 14:14:30 compute-0 podman[220310]: 2025-10-01 14:14:30.162224625 +0000 UTC m=+0.069585144 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:14:30 compute-0 podman[220311]: 2025-10-01 14:14:30.21306201 +0000 UTC m=+0.107056037 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 01 14:14:31 compute-0 openstack_network_exporter[205307]: ERROR   14:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:14:31 compute-0 openstack_network_exporter[205307]: ERROR   14:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:14:31 compute-0 openstack_network_exporter[205307]: ERROR   14:14:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:14:31 compute-0 openstack_network_exporter[205307]: ERROR   14:14:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:14:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:14:31 compute-0 openstack_network_exporter[205307]: ERROR   14:14:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:14:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:14:32 compute-0 nova_compute[192698]: 2025-10-01 14:14:32.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:33 compute-0 nova_compute[192698]: 2025-10-01 14:14:33.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:37 compute-0 nova_compute[192698]: 2025-10-01 14:14:37.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:38 compute-0 podman[220356]: 2025-10-01 14:14:38.180880971 +0000 UTC m=+0.094873268 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 01 14:14:38 compute-0 nova_compute[192698]: 2025-10-01 14:14:38.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:42 compute-0 nova_compute[192698]: 2025-10-01 14:14:42.015 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:14:42 compute-0 nova_compute[192698]: 2025-10-01 14:14:42.016 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:14:42 compute-0 nova_compute[192698]: 2025-10-01 14:14:42.522 2 DEBUG nova.compute.manager [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 01 14:14:42 compute-0 nova_compute[192698]: 2025-10-01 14:14:42.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:43 compute-0 nova_compute[192698]: 2025-10-01 14:14:43.163 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:14:43 compute-0 nova_compute[192698]: 2025-10-01 14:14:43.164 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:14:43 compute-0 nova_compute[192698]: 2025-10-01 14:14:43.175 2 DEBUG nova.virt.hardware [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:14:43 compute-0 nova_compute[192698]: 2025-10-01 14:14:43.176 2 INFO nova.compute.claims [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:14:43 compute-0 nova_compute[192698]: 2025-10-01 14:14:43.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:44 compute-0 podman[220377]: 2025-10-01 14:14:44.163250551 +0000 UTC m=+0.075653618 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2)
Oct 01 14:14:44 compute-0 podman[220378]: 2025-10-01 14:14:44.163772555 +0000 UTC m=+0.069573003 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:14:44 compute-0 nova_compute[192698]: 2025-10-01 14:14:44.278 2 DEBUG nova.compute.provider_tree [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:14:44 compute-0 nova_compute[192698]: 2025-10-01 14:14:44.788 2 DEBUG nova.scheduler.client.report [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:14:45 compute-0 nova_compute[192698]: 2025-10-01 14:14:45.299 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.135s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:45 compute-0 nova_compute[192698]: 2025-10-01 14:14:45.300 2 DEBUG nova.compute.manager [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 01 14:14:45 compute-0 nova_compute[192698]: 2025-10-01 14:14:45.814 2 DEBUG nova.compute.manager [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 01 14:14:45 compute-0 nova_compute[192698]: 2025-10-01 14:14:45.815 2 DEBUG nova.network.neutron [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 01 14:14:45 compute-0 nova_compute[192698]: 2025-10-01 14:14:45.816 2 WARNING neutronclient.v2_0.client [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:14:45 compute-0 nova_compute[192698]: 2025-10-01 14:14:45.816 2 WARNING neutronclient.v2_0.client [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:14:46 compute-0 nova_compute[192698]: 2025-10-01 14:14:46.328 2 INFO nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 01 14:14:46 compute-0 nova_compute[192698]: 2025-10-01 14:14:46.561 2 DEBUG nova.network.neutron [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Successfully created port: 2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 01 14:14:46 compute-0 nova_compute[192698]: 2025-10-01 14:14:46.837 2 DEBUG nova.compute.manager [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.350 2 DEBUG nova.network.neutron [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Successfully updated port: 2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.408 2 DEBUG nova.compute.manager [req-add35e1c-7ecc-4e0f-8906-53c9ab1f476c req-1e317b96-ef6c-4ced-bfcd-ad07a0863b5e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Received event network-changed-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.408 2 DEBUG nova.compute.manager [req-add35e1c-7ecc-4e0f-8906-53c9ab1f476c req-1e317b96-ef6c-4ced-bfcd-ad07a0863b5e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Refreshing instance network info cache due to event network-changed-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.408 2 DEBUG oslo_concurrency.lockutils [req-add35e1c-7ecc-4e0f-8906-53c9ab1f476c req-1e317b96-ef6c-4ced-bfcd-ad07a0863b5e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-75d77ae6-fd71-4357-86ca-e8d2afafce7e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.408 2 DEBUG oslo_concurrency.lockutils [req-add35e1c-7ecc-4e0f-8906-53c9ab1f476c req-1e317b96-ef6c-4ced-bfcd-ad07a0863b5e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-75d77ae6-fd71-4357-86ca-e8d2afafce7e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.409 2 DEBUG nova.network.neutron [req-add35e1c-7ecc-4e0f-8906-53c9ab1f476c req-1e317b96-ef6c-4ced-bfcd-ad07a0863b5e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Refreshing network info cache for port 2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.858 2 DEBUG nova.compute.manager [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.860 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.861 2 INFO nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Creating image(s)
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.862 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "/var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.862 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "/var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.863 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "/var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.864 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.871 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.873 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "refresh_cache-75d77ae6-fd71-4357-86ca-e8d2afafce7e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.873 2 DEBUG oslo_concurrency.processutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.914 2 WARNING neutronclient.v2_0.client [req-add35e1c-7ecc-4e0f-8906-53c9ab1f476c req-1e317b96-ef6c-4ced-bfcd-ad07a0863b5e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.954 2 DEBUG oslo_concurrency.processutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.954 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.955 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.955 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.959 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:14:47 compute-0 nova_compute[192698]: 2025-10-01 14:14:47.960 2 DEBUG oslo_concurrency.processutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.024 2 DEBUG oslo_concurrency.processutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.025 2 DEBUG oslo_concurrency.processutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.085 2 DEBUG oslo_concurrency.processutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.086 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.086 2 DEBUG oslo_concurrency.processutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.158 2 DEBUG oslo_concurrency.processutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.159 2 DEBUG nova.virt.disk.api [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Checking if we can resize image /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.160 2 DEBUG oslo_concurrency.processutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.174 2 DEBUG nova.network.neutron [req-add35e1c-7ecc-4e0f-8906-53c9ab1f476c req-1e317b96-ef6c-4ced-bfcd-ad07a0863b5e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.235 2 DEBUG oslo_concurrency.processutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.237 2 DEBUG nova.virt.disk.api [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Cannot resize image /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.238 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.239 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Ensure instance console log exists: /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.240 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.240 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.241 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.345 2 DEBUG nova.network.neutron [req-add35e1c-7ecc-4e0f-8906-53c9ab1f476c req-1e317b96-ef6c-4ced-bfcd-ad07a0863b5e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.938 2 DEBUG oslo_concurrency.lockutils [req-add35e1c-7ecc-4e0f-8906-53c9ab1f476c req-1e317b96-ef6c-4ced-bfcd-ad07a0863b5e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-75d77ae6-fd71-4357-86ca-e8d2afafce7e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.939 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquired lock "refresh_cache-75d77ae6-fd71-4357-86ca-e8d2afafce7e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:14:48 compute-0 nova_compute[192698]: 2025-10-01 14:14:48.940 2 DEBUG nova.network.neutron [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:14:49 compute-0 podman[220433]: 2025-10-01 14:14:49.174882466 +0000 UTC m=+0.091419765 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:14:50 compute-0 nova_compute[192698]: 2025-10-01 14:14:50.154 2 DEBUG nova.network.neutron [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:14:50 compute-0 nova_compute[192698]: 2025-10-01 14:14:50.342 2 WARNING neutronclient.v2_0.client [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:14:50 compute-0 nova_compute[192698]: 2025-10-01 14:14:50.489 2 DEBUG nova.network.neutron [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Updating instance_info_cache with network_info: [{"id": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "address": "fa:16:3e:61:32:e5", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f0aa00e-d9", "ovs_interfaceid": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:14:50 compute-0 nova_compute[192698]: 2025-10-01 14:14:50.995 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Releasing lock "refresh_cache-75d77ae6-fd71-4357-86ca-e8d2afafce7e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:14:50 compute-0 nova_compute[192698]: 2025-10-01 14:14:50.996 2 DEBUG nova.compute.manager [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Instance network_info: |[{"id": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "address": "fa:16:3e:61:32:e5", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f0aa00e-d9", "ovs_interfaceid": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.001 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Start _get_guest_xml network_info=[{"id": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "address": "fa:16:3e:61:32:e5", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f0aa00e-d9", "ovs_interfaceid": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.007 2 WARNING nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.009 2 DEBUG nova.virt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-70417407', uuid='75d77ae6-fd71-4357-86ca-e8d2afafce7e'), owner=OwnerMeta(userid='8e4b771b5757444093151a3e38c0b2d7', username='tempest-TestExecuteHostMaintenanceStrategy-132658549-project-admin', projectid='9f5565c36a294928af6bcd073bff4643', projectname='tempest-TestExecuteHostMaintenanceStrategy-132658549'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "address": "fa:16:3e:61:32:e5", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f0aa00e-d9", "ovs_interfaceid": 
"2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759328091.0095134) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.015 2 DEBUG nova.virt.libvirt.host [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.017 2 DEBUG nova.virt.libvirt.host [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.021 2 DEBUG nova.virt.libvirt.host [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.022 2 DEBUG nova.virt.libvirt.host [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.023 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.024 2 DEBUG nova.virt.hardware [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.024 2 DEBUG nova.virt.hardware [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.025 2 DEBUG nova.virt.hardware [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.025 2 DEBUG nova.virt.hardware [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.025 2 DEBUG nova.virt.hardware [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.026 2 DEBUG nova.virt.hardware [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.026 2 DEBUG nova.virt.hardware [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.027 2 DEBUG nova.virt.hardware [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.027 2 DEBUG nova.virt.hardware [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.027 2 DEBUG nova.virt.hardware [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.028 2 DEBUG nova.virt.hardware [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.036 2 DEBUG nova.virt.libvirt.vif [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:14:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-70417407',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-70417407',id=15,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f5565c36a294928af6bcd073bff4643',ramdisk_id='',reservation_id='r-zy244pm6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-132658549',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-132658549-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:14:46Z,user_data=None,user_id='8e4b771b5757444093151a3e38c0b2d7',uuid=75d77ae6-fd71-4357-86ca-e8d2afafce7e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "address": "fa:16:3e:61:32:e5", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f0aa00e-d9", "ovs_interfaceid": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.037 2 DEBUG nova.network.os_vif_util [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converting VIF {"id": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "address": "fa:16:3e:61:32:e5", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f0aa00e-d9", "ovs_interfaceid": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.038 2 DEBUG nova.network.os_vif_util [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:32:e5,bridge_name='br-int',has_traffic_filtering=True,id=2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f0aa00e-d9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.040 2 DEBUG nova.objects.instance [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lazy-loading 'pci_devices' on Instance uuid 75d77ae6-fd71-4357-86ca-e8d2afafce7e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.553 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:14:51 compute-0 nova_compute[192698]:   <uuid>75d77ae6-fd71-4357-86ca-e8d2afafce7e</uuid>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   <name>instance-0000000f</name>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-70417407</nova:name>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:14:51</nova:creationTime>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:14:51 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:14:51 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:user uuid="8e4b771b5757444093151a3e38c0b2d7">tempest-TestExecuteHostMaintenanceStrategy-132658549-project-admin</nova:user>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:project uuid="9f5565c36a294928af6bcd073bff4643">tempest-TestExecuteHostMaintenanceStrategy-132658549</nova:project>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         <nova:port uuid="2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77">
Oct 01 14:14:51 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <system>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <entry name="serial">75d77ae6-fd71-4357-86ca-e8d2afafce7e</entry>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <entry name="uuid">75d77ae6-fd71-4357-86ca-e8d2afafce7e</entry>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     </system>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   <os>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   </os>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   <features>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   </features>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk.config"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:61:32:e5"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <target dev="tap2f0aa00e-d9"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/console.log" append="off"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <video>
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     </video>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:14:51 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:14:51 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:14:51 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:14:51 compute-0 nova_compute[192698]: </domain>
Oct 01 14:14:51 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.555 2 DEBUG nova.compute.manager [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Preparing to wait for external event network-vif-plugged-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.556 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.556 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.556 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.557 2 DEBUG nova.virt.libvirt.vif [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:14:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-70417407',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-70417407',id=15,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f5565c36a294928af6bcd073bff4643',ramdisk_id='',reservation_id='r-zy244pm6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-132658549',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-132658549-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:14:46Z,user_data=None,user_id='8e4b771b5757444093151a3e38c0b2d7',uuid=75d77ae6-fd71-4357-86ca-e8d2afafce7e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "address": "fa:16:3e:61:32:e5", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f0aa00e-d9", "ovs_interfaceid": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.558 2 DEBUG nova.network.os_vif_util [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converting VIF {"id": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "address": "fa:16:3e:61:32:e5", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f0aa00e-d9", "ovs_interfaceid": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.558 2 DEBUG nova.network.os_vif_util [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:32:e5,bridge_name='br-int',has_traffic_filtering=True,id=2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f0aa00e-d9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.559 2 DEBUG os_vif [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:32:e5,bridge_name='br-int',has_traffic_filtering=True,id=2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f0aa00e-d9') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.560 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.560 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.561 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'faaa1f58-8fea-5b98-bc17-888693398f85', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.569 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2f0aa00e-d9, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.569 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2f0aa00e-d9, col_values=(('qos', UUID('588d1887-b4f6-42ff-8e39-0dead67b45b8')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.570 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2f0aa00e-d9, col_values=(('external_ids', {'iface-id': '2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:61:32:e5', 'vm-uuid': '75d77ae6-fd71-4357-86ca-e8d2afafce7e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:51 compute-0 NetworkManager[51741]: <info>  [1759328091.5731] manager: (tap2f0aa00e-d9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:51 compute-0 nova_compute[192698]: 2025-10-01 14:14:51.579 2 INFO os_vif [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:32:e5,bridge_name='br-int',has_traffic_filtering=True,id=2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f0aa00e-d9')
Oct 01 14:14:53 compute-0 nova_compute[192698]: 2025-10-01 14:14:53.154 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:14:53 compute-0 nova_compute[192698]: 2025-10-01 14:14:53.155 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:14:53 compute-0 nova_compute[192698]: 2025-10-01 14:14:53.155 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] No VIF found with MAC fa:16:3e:61:32:e5, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:14:53 compute-0 nova_compute[192698]: 2025-10-01 14:14:53.156 2 INFO nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Using config drive
Oct 01 14:14:53 compute-0 nova_compute[192698]: 2025-10-01 14:14:53.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:53 compute-0 nova_compute[192698]: 2025-10-01 14:14:53.670 2 WARNING neutronclient.v2_0.client [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:14:54 compute-0 nova_compute[192698]: 2025-10-01 14:14:54.235 2 INFO nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Creating config drive at /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk.config
Oct 01 14:14:54 compute-0 nova_compute[192698]: 2025-10-01 14:14:54.244 2 DEBUG oslo_concurrency.processutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpjq006hfv execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:14:54 compute-0 nova_compute[192698]: 2025-10-01 14:14:54.394 2 DEBUG oslo_concurrency.processutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpjq006hfv" returned: 0 in 0.150s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:14:54 compute-0 kernel: tap2f0aa00e-d9: entered promiscuous mode
Oct 01 14:14:54 compute-0 NetworkManager[51741]: <info>  [1759328094.5024] manager: (tap2f0aa00e-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Oct 01 14:14:54 compute-0 ovn_controller[94909]: 2025-10-01T14:14:54Z|00116|binding|INFO|Claiming lport 2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 for this chassis.
Oct 01 14:14:54 compute-0 ovn_controller[94909]: 2025-10-01T14:14:54Z|00117|binding|INFO|2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77: Claiming fa:16:3e:61:32:e5 10.100.0.5
Oct 01 14:14:54 compute-0 nova_compute[192698]: 2025-10-01 14:14:54.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.511 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:32:e5 10.100.0.5'], port_security=['fa:16:3e:61:32:e5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '75d77ae6-fd71-4357-86ca-e8d2afafce7e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8562f9c0-0a2b-4e53-975b-dd543293c802', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f5565c36a294928af6bcd073bff4643', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd07d6cb5-684b-4a4b-83f2-c6fbca49c797', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18a05557-2e37-4ffc-9c62-b55a7756059d, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.513 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 in datapath 8562f9c0-0a2b-4e53-975b-dd543293c802 bound to our chassis
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.514 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8562f9c0-0a2b-4e53-975b-dd543293c802
Oct 01 14:14:54 compute-0 ovn_controller[94909]: 2025-10-01T14:14:54Z|00118|binding|INFO|Setting lport 2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 ovn-installed in OVS
Oct 01 14:14:54 compute-0 ovn_controller[94909]: 2025-10-01T14:14:54Z|00119|binding|INFO|Setting lport 2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 up in Southbound
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.534 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[509032b0-d385-4fab-8f11-54461a45060b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 nova_compute[192698]: 2025-10-01 14:14:54.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.535 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8562f9c0-01 in ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.539 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8562f9c0-00 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.540 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7658ac9e-1822-4f8d-8930-2bab236a9688]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 nova_compute[192698]: 2025-10-01 14:14:54.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.542 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[449e2a76-99a0-4e5e-b7df-a69528dde04c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 systemd-udevd[220478]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.563 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[7db87387-b9e0-4976-a3d7-caff6872e3d0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 systemd-machined[152704]: New machine qemu-10-instance-0000000f.
Oct 01 14:14:54 compute-0 NetworkManager[51741]: <info>  [1759328094.5826] device (tap2f0aa00e-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:14:54 compute-0 NetworkManager[51741]: <info>  [1759328094.5845] device (tap2f0aa00e-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:14:54 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000f.
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.585 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e32d1fea-cfda-4360-84b6-2e423bfd1ee4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.634 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a4fee0e2-cf36-4a7c-aaff-a347ab0e5838]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 NetworkManager[51741]: <info>  [1759328094.6437] manager: (tap8562f9c0-00): new Veth device (/org/freedesktop/NetworkManager/Devices/50)
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.643 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c94ac856-fcf4-45cf-8e3c-86e45815dcb0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 systemd-udevd[220482]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.697 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[6eb5c57f-0df2-4995-be04-bc5ac25d7f17]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.703 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[de616dcd-6828-497d-9632-81276145209e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 NetworkManager[51741]: <info>  [1759328094.7366] device (tap8562f9c0-00): carrier: link connected
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.744 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[20a391be-c150-4af6-b44c-05044d60cbb7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.777 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[48576071-cccb-46f1-a041-04605b0a94bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8562f9c0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:ed:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445526, 'reachable_time': 36558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220510, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.811 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[80d5969a-1bad-4034-a02f-7709b108cff3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef4:ed77'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445526, 'tstamp': 445526}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220511, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.838 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[df05c266-1c1a-4706-ba6a-0aa0833d9b33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8562f9c0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:ed:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445526, 'reachable_time': 36558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220512, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.886 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[9c747d02-1477-4b6e-958e-aa8f67820a4e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.973 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[724e3e92-0df9-47d0-a15b-a1362ba7aa1b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.975 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8562f9c0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.976 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.976 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8562f9c0-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:14:54 compute-0 nova_compute[192698]: 2025-10-01 14:14:54.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:54 compute-0 NetworkManager[51741]: <info>  [1759328094.9787] manager: (tap8562f9c0-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Oct 01 14:14:54 compute-0 kernel: tap8562f9c0-00: entered promiscuous mode
Oct 01 14:14:54 compute-0 nova_compute[192698]: 2025-10-01 14:14:54.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:54.981 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8562f9c0-00, col_values=(('external_ids', {'iface-id': 'b5ee4d88-5d32-4dfa-ae97-c0c0976243b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:14:54 compute-0 nova_compute[192698]: 2025-10-01 14:14:54.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:54 compute-0 ovn_controller[94909]: 2025-10-01T14:14:54Z|00120|binding|INFO|Releasing lport b5ee4d88-5d32-4dfa-ae97-c0c0976243b5 from this chassis (sb_readonly=0)
Oct 01 14:14:55 compute-0 nova_compute[192698]: 2025-10-01 14:14:55.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:55.009 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[96481a65-6f2d-4619-8027-5011dbbc65c1]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:55.010 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:55.010 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:55.010 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 8562f9c0-0a2b-4e53-975b-dd543293c802 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:55.010 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:55.011 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[493f7b0a-434e-49bc-95aa-ddbe5942642b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:55.011 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:55.012 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[27f1399f-c417-411c-8fe7-33133c5e646e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:55.012 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-8562f9c0-0a2b-4e53-975b-dd543293c802
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID 8562f9c0-0a2b-4e53-975b-dd543293c802
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:55.013 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'env', 'PROCESS_TAG=haproxy-8562f9c0-0a2b-4e53-975b-dd543293c802', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8562f9c0-0a2b-4e53-975b-dd543293c802.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:14:55 compute-0 nova_compute[192698]: 2025-10-01 14:14:55.299 2 DEBUG nova.compute.manager [req-a49483f9-b116-439f-a6e3-e4f12b02d7b1 req-4ccc30ae-ec09-480c-8742-558b0c2161e3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Received event network-vif-plugged-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:14:55 compute-0 nova_compute[192698]: 2025-10-01 14:14:55.301 2 DEBUG oslo_concurrency.lockutils [req-a49483f9-b116-439f-a6e3-e4f12b02d7b1 req-4ccc30ae-ec09-480c-8742-558b0c2161e3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:14:55 compute-0 nova_compute[192698]: 2025-10-01 14:14:55.301 2 DEBUG oslo_concurrency.lockutils [req-a49483f9-b116-439f-a6e3-e4f12b02d7b1 req-4ccc30ae-ec09-480c-8742-558b0c2161e3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:14:55 compute-0 nova_compute[192698]: 2025-10-01 14:14:55.301 2 DEBUG oslo_concurrency.lockutils [req-a49483f9-b116-439f-a6e3-e4f12b02d7b1 req-4ccc30ae-ec09-480c-8742-558b0c2161e3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:55 compute-0 nova_compute[192698]: 2025-10-01 14:14:55.302 2 DEBUG nova.compute.manager [req-a49483f9-b116-439f-a6e3-e4f12b02d7b1 req-4ccc30ae-ec09-480c-8742-558b0c2161e3 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Processing event network-vif-plugged-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:55.398 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:14:55 compute-0 nova_compute[192698]: 2025-10-01 14:14:55.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:55 compute-0 podman[220550]: 2025-10-01 14:14:55.494380396 +0000 UTC m=+0.065812310 container create ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Oct 01 14:14:55 compute-0 systemd[1]: Started libpod-conmon-ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a.scope.
Oct 01 14:14:55 compute-0 podman[220550]: 2025-10-01 14:14:55.455447624 +0000 UTC m=+0.026879548 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:14:55 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:14:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c48c5f5544a6d4bf02a9f3c75a1b2c1a015ac5c42c5120994bccd413c28ea5a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:14:55 compute-0 nova_compute[192698]: 2025-10-01 14:14:55.589 2 DEBUG nova.compute.manager [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:14:55 compute-0 nova_compute[192698]: 2025-10-01 14:14:55.593 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 01 14:14:55 compute-0 nova_compute[192698]: 2025-10-01 14:14:55.597 2 INFO nova.virt.libvirt.driver [-] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Instance spawned successfully.
Oct 01 14:14:55 compute-0 nova_compute[192698]: 2025-10-01 14:14:55.597 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 01 14:14:55 compute-0 podman[220550]: 2025-10-01 14:14:55.611601267 +0000 UTC m=+0.183033221 container init ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 01 14:14:55 compute-0 podman[220550]: 2025-10-01 14:14:55.621611468 +0000 UTC m=+0.193043382 container start ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 01 14:14:55 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[220565]: [NOTICE]   (220569) : New worker (220571) forked
Oct 01 14:14:55 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[220565]: [NOTICE]   (220569) : Loading success.
Oct 01 14:14:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:14:55.703 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:14:56 compute-0 nova_compute[192698]: 2025-10-01 14:14:56.123 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:14:56 compute-0 nova_compute[192698]: 2025-10-01 14:14:56.125 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:14:56 compute-0 nova_compute[192698]: 2025-10-01 14:14:56.127 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:14:56 compute-0 nova_compute[192698]: 2025-10-01 14:14:56.128 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:14:56 compute-0 nova_compute[192698]: 2025-10-01 14:14:56.129 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:14:56 compute-0 nova_compute[192698]: 2025-10-01 14:14:56.130 2 DEBUG nova.virt.libvirt.driver [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:14:56 compute-0 nova_compute[192698]: 2025-10-01 14:14:56.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:56 compute-0 nova_compute[192698]: 2025-10-01 14:14:56.643 2 INFO nova.compute.manager [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Took 8.78 seconds to spawn the instance on the hypervisor.
Oct 01 14:14:56 compute-0 nova_compute[192698]: 2025-10-01 14:14:56.644 2 DEBUG nova.compute.manager [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:14:57 compute-0 nova_compute[192698]: 2025-10-01 14:14:57.194 2 INFO nova.compute.manager [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Took 14.14 seconds to build instance.
Oct 01 14:14:57 compute-0 nova_compute[192698]: 2025-10-01 14:14:57.358 2 DEBUG nova.compute.manager [req-5bed6cc3-970b-42d2-8ed8-4e041a8c17a0 req-33c413bb-8b63-4a58-9a31-ed4a3e4a1f98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Received event network-vif-plugged-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:14:57 compute-0 nova_compute[192698]: 2025-10-01 14:14:57.359 2 DEBUG oslo_concurrency.lockutils [req-5bed6cc3-970b-42d2-8ed8-4e041a8c17a0 req-33c413bb-8b63-4a58-9a31-ed4a3e4a1f98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:14:57 compute-0 nova_compute[192698]: 2025-10-01 14:14:57.360 2 DEBUG oslo_concurrency.lockutils [req-5bed6cc3-970b-42d2-8ed8-4e041a8c17a0 req-33c413bb-8b63-4a58-9a31-ed4a3e4a1f98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:14:57 compute-0 nova_compute[192698]: 2025-10-01 14:14:57.361 2 DEBUG oslo_concurrency.lockutils [req-5bed6cc3-970b-42d2-8ed8-4e041a8c17a0 req-33c413bb-8b63-4a58-9a31-ed4a3e4a1f98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:57 compute-0 nova_compute[192698]: 2025-10-01 14:14:57.361 2 DEBUG nova.compute.manager [req-5bed6cc3-970b-42d2-8ed8-4e041a8c17a0 req-33c413bb-8b63-4a58-9a31-ed4a3e4a1f98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] No waiting events found dispatching network-vif-plugged-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:14:57 compute-0 nova_compute[192698]: 2025-10-01 14:14:57.362 2 WARNING nova.compute.manager [req-5bed6cc3-970b-42d2-8ed8-4e041a8c17a0 req-33c413bb-8b63-4a58-9a31-ed4a3e4a1f98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Received unexpected event network-vif-plugged-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 for instance with vm_state active and task_state None.
Oct 01 14:14:57 compute-0 nova_compute[192698]: 2025-10-01 14:14:57.701 2 DEBUG oslo_concurrency.lockutils [None req-f5cc02f3-74d8-4d6a-bfb6-5f6fef6b44b5 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.685s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:14:58 compute-0 nova_compute[192698]: 2025-10-01 14:14:58.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:14:58 compute-0 unix_chkpwd[220583]: password check failed for user (root)
Oct 01 14:14:58 compute-0 sshd-session[220581]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 01 14:14:59 compute-0 podman[203144]: time="2025-10-01T14:14:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:14:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:14:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:14:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:14:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3482 "" "Go-http-client/1.1"
Oct 01 14:15:00 compute-0 sshd-session[220581]: Failed password for root from 91.224.92.32 port 41304 ssh2
Oct 01 14:15:01 compute-0 podman[220584]: 2025-10-01 14:15:01.184517689 +0000 UTC m=+0.082639707 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct 01 14:15:01 compute-0 podman[220585]: 2025-10-01 14:15:01.224067729 +0000 UTC m=+0.123024349 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 01 14:15:01 compute-0 openstack_network_exporter[205307]: ERROR   14:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:15:01 compute-0 openstack_network_exporter[205307]: ERROR   14:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:15:01 compute-0 openstack_network_exporter[205307]: ERROR   14:15:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:15:01 compute-0 openstack_network_exporter[205307]: ERROR   14:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:15:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:15:01 compute-0 openstack_network_exporter[205307]: ERROR   14:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:15:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:15:01 compute-0 unix_chkpwd[220628]: password check failed for user (root)
Oct 01 14:15:01 compute-0 nova_compute[192698]: 2025-10-01 14:15:01.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:03 compute-0 nova_compute[192698]: 2025-10-01 14:15:03.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:03.704 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:03 compute-0 sshd-session[220581]: Failed password for root from 91.224.92.32 port 41304 ssh2
Oct 01 14:15:04 compute-0 unix_chkpwd[220629]: password check failed for user (root)
Oct 01 14:15:06 compute-0 sshd-session[220581]: Failed password for root from 91.224.92.32 port 41304 ssh2
Oct 01 14:15:06 compute-0 nova_compute[192698]: 2025-10-01 14:15:06.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:07 compute-0 sshd-session[220581]: Received disconnect from 91.224.92.32 port 41304:11:  [preauth]
Oct 01 14:15:07 compute-0 sshd-session[220581]: Disconnected from authenticating user root 91.224.92.32 port 41304 [preauth]
Oct 01 14:15:07 compute-0 sshd-session[220581]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 01 14:15:07 compute-0 unix_chkpwd[220647]: password check failed for user (root)
Oct 01 14:15:07 compute-0 sshd-session[220645]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 01 14:15:08 compute-0 nova_compute[192698]: 2025-10-01 14:15:08.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:08 compute-0 ovn_controller[94909]: 2025-10-01T14:15:08Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:61:32:e5 10.100.0.5
Oct 01 14:15:08 compute-0 ovn_controller[94909]: 2025-10-01T14:15:08Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:61:32:e5 10.100.0.5
Oct 01 14:15:09 compute-0 podman[220648]: 2025-10-01 14:15:09.188373918 +0000 UTC m=+0.098264060 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 01 14:15:10 compute-0 sshd-session[220645]: Failed password for root from 91.224.92.32 port 31414 ssh2
Oct 01 14:15:10 compute-0 unix_chkpwd[220670]: password check failed for user (root)
Oct 01 14:15:11 compute-0 nova_compute[192698]: 2025-10-01 14:15:11.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:11 compute-0 nova_compute[192698]: 2025-10-01 14:15:11.891 2 DEBUG nova.virt.libvirt.driver [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Creating tmpfile /var/lib/nova/instances/tmpzq8gaowb to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 01 14:15:11 compute-0 nova_compute[192698]: 2025-10-01 14:15:11.892 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:11 compute-0 nova_compute[192698]: 2025-10-01 14:15:11.909 2 DEBUG nova.compute.manager [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzq8gaowb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 01 14:15:12 compute-0 sshd-session[220645]: Failed password for root from 91.224.92.32 port 31414 ssh2
Oct 01 14:15:13 compute-0 nova_compute[192698]: 2025-10-01 14:15:13.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:13 compute-0 unix_chkpwd[220672]: password check failed for user (root)
Oct 01 14:15:13 compute-0 nova_compute[192698]: 2025-10-01 14:15:13.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:15:13 compute-0 nova_compute[192698]: 2025-10-01 14:15:13.955 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:14.259 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:14.259 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:14.260 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:14 compute-0 nova_compute[192698]: 2025-10-01 14:15:14.444 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:14 compute-0 nova_compute[192698]: 2025-10-01 14:15:14.445 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:14 compute-0 nova_compute[192698]: 2025-10-01 14:15:14.446 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:14 compute-0 nova_compute[192698]: 2025-10-01 14:15:14.446 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:15:15 compute-0 podman[220674]: 2025-10-01 14:15:15.170017506 +0000 UTC m=+0.080415526 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct 01 14:15:15 compute-0 podman[220675]: 2025-10-01 14:15:15.177864929 +0000 UTC m=+0.088398933 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 01 14:15:15 compute-0 nova_compute[192698]: 2025-10-01 14:15:15.522 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:15:15 compute-0 nova_compute[192698]: 2025-10-01 14:15:15.587 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:15:15 compute-0 nova_compute[192698]: 2025-10-01 14:15:15.589 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:15:15 compute-0 nova_compute[192698]: 2025-10-01 14:15:15.662 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:15:15 compute-0 sshd-session[220645]: Failed password for root from 91.224.92.32 port 31414 ssh2
Oct 01 14:15:15 compute-0 nova_compute[192698]: 2025-10-01 14:15:15.866 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:15:15 compute-0 nova_compute[192698]: 2025-10-01 14:15:15.867 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:15:15 compute-0 nova_compute[192698]: 2025-10-01 14:15:15.905 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:15:15 compute-0 nova_compute[192698]: 2025-10-01 14:15:15.905 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5674MB free_disk=73.27434921264648GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:15:15 compute-0 nova_compute[192698]: 2025-10-01 14:15:15.905 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:15 compute-0 nova_compute[192698]: 2025-10-01 14:15:15.906 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:16 compute-0 sshd-session[220645]: Received disconnect from 91.224.92.32 port 31414:11:  [preauth]
Oct 01 14:15:16 compute-0 sshd-session[220645]: Disconnected from authenticating user root 91.224.92.32 port 31414 [preauth]
Oct 01 14:15:16 compute-0 sshd-session[220645]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 01 14:15:16 compute-0 nova_compute[192698]: 2025-10-01 14:15:16.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:16 compute-0 nova_compute[192698]: 2025-10-01 14:15:16.926 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Migration for instance 4420663f-9978-437d-94b6-3b804f40c5df refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 01 14:15:17 compute-0 unix_chkpwd[220721]: password check failed for user (root)
Oct 01 14:15:17 compute-0 sshd-session[220719]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 01 14:15:17 compute-0 nova_compute[192698]: 2025-10-01 14:15:17.436 2 INFO nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Updating resource usage from migration 95f06d28-3be4-4085-9e46-ab03ca2a2a33
Oct 01 14:15:17 compute-0 nova_compute[192698]: 2025-10-01 14:15:17.436 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Starting to track incoming migration 95f06d28-3be4-4085-9e46-ab03ca2a2a33 with flavor 69702c4b-38f2-49d1-96d5-85671652c67e _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 01 14:15:18 compute-0 nova_compute[192698]: 2025-10-01 14:15:18.000 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 75d77ae6-fd71-4357-86ca-e8d2afafce7e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:15:18 compute-0 nova_compute[192698]: 2025-10-01 14:15:18.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:18 compute-0 nova_compute[192698]: 2025-10-01 14:15:18.509 2 WARNING nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 4420663f-9978-437d-94b6-3b804f40c5df has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 01 14:15:18 compute-0 nova_compute[192698]: 2025-10-01 14:15:18.510 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:15:18 compute-0 nova_compute[192698]: 2025-10-01 14:15:18.510 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:15:15 up  1:14,  0 user,  load average: 0.40, 0.27, 0.38\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_9f5565c36a294928af6bcd073bff4643': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:15:18 compute-0 nova_compute[192698]: 2025-10-01 14:15:18.576 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:15:18 compute-0 nova_compute[192698]: 2025-10-01 14:15:18.680 2 DEBUG nova.compute.manager [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzq8gaowb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4420663f-9978-437d-94b6-3b804f40c5df',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 01 14:15:19 compute-0 sshd-session[220719]: Failed password for root from 91.224.92.32 port 32088 ssh2
Oct 01 14:15:19 compute-0 nova_compute[192698]: 2025-10-01 14:15:19.084 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:15:19 compute-0 nova_compute[192698]: 2025-10-01 14:15:19.596 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:15:19 compute-0 nova_compute[192698]: 2025-10-01 14:15:19.597 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.691s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:19 compute-0 nova_compute[192698]: 2025-10-01 14:15:19.695 2 DEBUG oslo_concurrency.lockutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-4420663f-9978-437d-94b6-3b804f40c5df" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:15:19 compute-0 nova_compute[192698]: 2025-10-01 14:15:19.696 2 DEBUG oslo_concurrency.lockutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-4420663f-9978-437d-94b6-3b804f40c5df" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:15:19 compute-0 nova_compute[192698]: 2025-10-01 14:15:19.697 2 DEBUG nova.network.neutron [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:15:20 compute-0 unix_chkpwd[220722]: password check failed for user (root)
Oct 01 14:15:20 compute-0 podman[220723]: 2025-10-01 14:15:20.170136761 +0000 UTC m=+0.080610062 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:15:20 compute-0 nova_compute[192698]: 2025-10-01 14:15:20.209 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:21 compute-0 sshd-session[220719]: Failed password for root from 91.224.92.32 port 32088 ssh2
Oct 01 14:15:21 compute-0 nova_compute[192698]: 2025-10-01 14:15:21.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:21 compute-0 nova_compute[192698]: 2025-10-01 14:15:21.597 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:15:21 compute-0 nova_compute[192698]: 2025-10-01 14:15:21.598 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:15:22 compute-0 nova_compute[192698]: 2025-10-01 14:15:22.109 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:15:22 compute-0 nova_compute[192698]: 2025-10-01 14:15:22.110 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:15:22 compute-0 nova_compute[192698]: 2025-10-01 14:15:22.110 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:15:22 compute-0 nova_compute[192698]: 2025-10-01 14:15:22.110 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:15:22 compute-0 nova_compute[192698]: 2025-10-01 14:15:22.110 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:15:22 compute-0 nova_compute[192698]: 2025-10-01 14:15:22.184 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:22 compute-0 nova_compute[192698]: 2025-10-01 14:15:22.356 2 DEBUG nova.network.neutron [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Updating instance_info_cache with network_info: [{"id": "5e3b8a03-f64b-44cd-a4f4-4fe60fae5242", "address": "fa:16:3e:cc:c8:1b", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e3b8a03-f6", "ovs_interfaceid": "5e3b8a03-f64b-44cd-a4f4-4fe60fae5242", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:15:22 compute-0 unix_chkpwd[220747]: password check failed for user (root)
Oct 01 14:15:22 compute-0 nova_compute[192698]: 2025-10-01 14:15:22.863 2 DEBUG oslo_concurrency.lockutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-4420663f-9978-437d-94b6-3b804f40c5df" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:15:22 compute-0 nova_compute[192698]: 2025-10-01 14:15:22.880 2 DEBUG nova.virt.libvirt.driver [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzq8gaowb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4420663f-9978-437d-94b6-3b804f40c5df',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 01 14:15:22 compute-0 nova_compute[192698]: 2025-10-01 14:15:22.881 2 DEBUG nova.virt.libvirt.driver [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Creating instance directory: /var/lib/nova/instances/4420663f-9978-437d-94b6-3b804f40c5df pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 01 14:15:22 compute-0 nova_compute[192698]: 2025-10-01 14:15:22.882 2 DEBUG nova.virt.libvirt.driver [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Creating disk.info with the contents: {'/var/lib/nova/instances/4420663f-9978-437d-94b6-3b804f40c5df/disk': 'qcow2', '/var/lib/nova/instances/4420663f-9978-437d-94b6-3b804f40c5df/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 01 14:15:22 compute-0 nova_compute[192698]: 2025-10-01 14:15:22.883 2 DEBUG nova.virt.libvirt.driver [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 01 14:15:22 compute-0 nova_compute[192698]: 2025-10-01 14:15:22.884 2 DEBUG nova.objects.instance [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4420663f-9978-437d-94b6-3b804f40c5df obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.392 2 DEBUG oslo_utils.imageutils.format_inspector [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.398 2 DEBUG oslo_utils.imageutils.format_inspector [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.401 2 DEBUG oslo_concurrency.processutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.462 2 DEBUG oslo_concurrency.processutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.463 2 DEBUG oslo_concurrency.lockutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.464 2 DEBUG oslo_concurrency.lockutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.465 2 DEBUG oslo_utils.imageutils.format_inspector [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.472 2 DEBUG oslo_utils.imageutils.format_inspector [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.473 2 DEBUG oslo_concurrency.processutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.537 2 DEBUG oslo_concurrency.processutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.538 2 DEBUG oslo_concurrency.processutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/4420663f-9978-437d-94b6-3b804f40c5df/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.589 2 DEBUG oslo_concurrency.processutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/4420663f-9978-437d-94b6-3b804f40c5df/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.590 2 DEBUG oslo_concurrency.lockutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.591 2 DEBUG oslo_concurrency.processutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.667 2 DEBUG oslo_concurrency.processutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.669 2 DEBUG nova.virt.disk.api [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/4420663f-9978-437d-94b6-3b804f40c5df/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.669 2 DEBUG oslo_concurrency.processutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4420663f-9978-437d-94b6-3b804f40c5df/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.728 2 DEBUG oslo_concurrency.processutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4420663f-9978-437d-94b6-3b804f40c5df/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.730 2 DEBUG nova.virt.disk.api [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/4420663f-9978-437d-94b6-3b804f40c5df/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.730 2 DEBUG nova.objects.instance [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 4420663f-9978-437d-94b6-3b804f40c5df obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:15:23 compute-0 nova_compute[192698]: 2025-10-01 14:15:23.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.241 2 DEBUG nova.objects.base [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<4420663f-9978-437d-94b6-3b804f40c5df> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.242 2 DEBUG oslo_concurrency.processutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4420663f-9978-437d-94b6-3b804f40c5df/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.279 2 DEBUG oslo_concurrency.processutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4420663f-9978-437d-94b6-3b804f40c5df/disk.config 497664" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.281 2 DEBUG nova.virt.libvirt.driver [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.283 2 DEBUG nova.virt.libvirt.vif [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-01T14:14:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-41012981',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-41012981',id=14,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:14:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9f5565c36a294928af6bcd073bff4643',ramdisk_id='',reservation_id='r-ow3psbp4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-132658549',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-132658549-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:14:38Z,user_data=None,user_id='8e4b771b5757444093151a3e38c0b2d7',uuid=4420663f-9978-437d-94b6-3b804f40c5df,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5e3b8a03-f64b-44cd-a4f4-4fe60fae5242", "address": "fa:16:3e:cc:c8:1b", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5e3b8a03-f6", "ovs_interfaceid": "5e3b8a03-f64b-44cd-a4f4-4fe60fae5242", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.284 2 DEBUG nova.network.os_vif_util [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "5e3b8a03-f64b-44cd-a4f4-4fe60fae5242", "address": "fa:16:3e:cc:c8:1b", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5e3b8a03-f6", "ovs_interfaceid": "5e3b8a03-f64b-44cd-a4f4-4fe60fae5242", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.287 2 DEBUG nova.network.os_vif_util [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:c8:1b,bridge_name='br-int',has_traffic_filtering=True,id=5e3b8a03-f64b-44cd-a4f4-4fe60fae5242,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e3b8a03-f6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.288 2 DEBUG os_vif [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:c8:1b,bridge_name='br-int',has_traffic_filtering=True,id=5e3b8a03-f64b-44cd-a4f4-4fe60fae5242,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e3b8a03-f6') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.290 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.291 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.293 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '3cc2f05c-38ae-5002-aebb-992229389a8a', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.328 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e3b8a03-f6, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.329 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap5e3b8a03-f6, col_values=(('qos', UUID('24082f77-82a6-4e87-9385-2f6eab4325b7')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.330 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap5e3b8a03-f6, col_values=(('external_ids', {'iface-id': '5e3b8a03-f64b-44cd-a4f4-4fe60fae5242', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:c8:1b', 'vm-uuid': '4420663f-9978-437d-94b6-3b804f40c5df'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:24 compute-0 NetworkManager[51741]: <info>  [1759328124.3331] manager: (tap5e3b8a03-f6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.342 2 INFO os_vif [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:c8:1b,bridge_name='br-int',has_traffic_filtering=True,id=5e3b8a03-f64b-44cd-a4f4-4fe60fae5242,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e3b8a03-f6')
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.343 2 DEBUG nova.virt.libvirt.driver [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.343 2 DEBUG nova.compute.manager [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzq8gaowb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4420663f-9978-437d-94b6-3b804f40c5df',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 01 14:15:24 compute-0 nova_compute[192698]: 2025-10-01 14:15:24.345 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:24 compute-0 ovn_controller[94909]: 2025-10-01T14:15:24Z|00121|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 01 14:15:24 compute-0 sshd-session[220719]: Failed password for root from 91.224.92.32 port 32088 ssh2
Oct 01 14:15:25 compute-0 nova_compute[192698]: 2025-10-01 14:15:25.171 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:25 compute-0 sshd-session[220719]: Received disconnect from 91.224.92.32 port 32088:11:  [preauth]
Oct 01 14:15:25 compute-0 sshd-session[220719]: Disconnected from authenticating user root 91.224.92.32 port 32088 [preauth]
Oct 01 14:15:25 compute-0 sshd-session[220719]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.32  user=root
Oct 01 14:15:26 compute-0 nova_compute[192698]: 2025-10-01 14:15:26.255 2 DEBUG nova.network.neutron [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Port 5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 01 14:15:26 compute-0 nova_compute[192698]: 2025-10-01 14:15:26.272 2 DEBUG nova.compute.manager [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpzq8gaowb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4420663f-9978-437d-94b6-3b804f40c5df',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 01 14:15:28 compute-0 nova_compute[192698]: 2025-10-01 14:15:28.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:29 compute-0 nova_compute[192698]: 2025-10-01 14:15:29.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:29 compute-0 kernel: tap5e3b8a03-f6: entered promiscuous mode
Oct 01 14:15:29 compute-0 ovn_controller[94909]: 2025-10-01T14:15:29Z|00122|binding|INFO|Claiming lport 5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 for this additional chassis.
Oct 01 14:15:29 compute-0 ovn_controller[94909]: 2025-10-01T14:15:29Z|00123|binding|INFO|5e3b8a03-f64b-44cd-a4f4-4fe60fae5242: Claiming fa:16:3e:cc:c8:1b 10.100.0.6
Oct 01 14:15:29 compute-0 NetworkManager[51741]: <info>  [1759328129.3515] manager: (tap5e3b8a03-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Oct 01 14:15:29 compute-0 nova_compute[192698]: 2025-10-01 14:15:29.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:29 compute-0 ovn_controller[94909]: 2025-10-01T14:15:29Z|00124|binding|INFO|Setting lport 5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 ovn-installed in OVS
Oct 01 14:15:29 compute-0 nova_compute[192698]: 2025-10-01 14:15:29.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:29 compute-0 nova_compute[192698]: 2025-10-01 14:15:29.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:29 compute-0 systemd-udevd[220783]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:15:29 compute-0 systemd-machined[152704]: New machine qemu-11-instance-0000000e.
Oct 01 14:15:29 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000e.
Oct 01 14:15:29 compute-0 NetworkManager[51741]: <info>  [1759328129.4152] device (tap5e3b8a03-f6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:15:29 compute-0 NetworkManager[51741]: <info>  [1759328129.4162] device (tap5e3b8a03-f6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.508 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:c8:1b 10.100.0.6'], port_security=['fa:16:3e:cc:c8:1b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4420663f-9978-437d-94b6-3b804f40c5df', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8562f9c0-0a2b-4e53-975b-dd543293c802', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f5565c36a294928af6bcd073bff4643', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'd07d6cb5-684b-4a4b-83f2-c6fbca49c797', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18a05557-2e37-4ffc-9c62-b55a7756059d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=5e3b8a03-f64b-44cd-a4f4-4fe60fae5242) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.512 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 in datapath 8562f9c0-0a2b-4e53-975b-dd543293c802 unbound from our chassis
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.513 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8562f9c0-0a2b-4e53-975b-dd543293c802
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.537 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[624b1013-b099-4b51-bce4-fc15e121eb3a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.575 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[86d659b8-7ae7-492c-8ea9-bd522aa17ca5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.580 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[eacd3367-11f9-4090-be43-80f411e56f34]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.614 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[9bc64d65-d2db-4165-9b5e-cbe144b5d686]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.632 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4cd2ea89-3e16-461d-bffa-bab561cd7313]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8562f9c0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:ed:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445526, 'reachable_time': 36558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220799, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.652 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[73805007-1b9a-4e7e-a19c-2cef742f17fb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8562f9c0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445545, 'tstamp': 445545}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220804, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8562f9c0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445549, 'tstamp': 445549}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220804, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.653 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8562f9c0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:29 compute-0 nova_compute[192698]: 2025-10-01 14:15:29.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.657 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8562f9c0-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.657 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.657 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8562f9c0-00, col_values=(('external_ids', {'iface-id': 'b5ee4d88-5d32-4dfa-ae97-c0c0976243b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.658 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:15:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:29.660 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[9e768c82-8524-49fe-a2c4-efe33dec8910]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-8562f9c0-0a2b-4e53-975b-dd543293c802\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 8562f9c0-0a2b-4e53-975b-dd543293c802\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:29 compute-0 podman[203144]: time="2025-10-01T14:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:15:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:15:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3480 "" "Go-http-client/1.1"
Oct 01 14:15:31 compute-0 openstack_network_exporter[205307]: ERROR   14:15:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:15:31 compute-0 openstack_network_exporter[205307]: ERROR   14:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:15:31 compute-0 openstack_network_exporter[205307]: ERROR   14:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:15:31 compute-0 openstack_network_exporter[205307]: ERROR   14:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:15:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:15:31 compute-0 openstack_network_exporter[205307]: ERROR   14:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:15:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:15:32 compute-0 podman[220820]: 2025-10-01 14:15:32.142081291 +0000 UTC m=+0.058374140 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 01 14:15:32 compute-0 podman[220821]: 2025-10-01 14:15:32.232568269 +0000 UTC m=+0.137778048 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 14:15:33 compute-0 ovn_controller[94909]: 2025-10-01T14:15:33Z|00125|binding|INFO|Claiming lport 5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 for this chassis.
Oct 01 14:15:33 compute-0 ovn_controller[94909]: 2025-10-01T14:15:33Z|00126|binding|INFO|5e3b8a03-f64b-44cd-a4f4-4fe60fae5242: Claiming fa:16:3e:cc:c8:1b 10.100.0.6
Oct 01 14:15:33 compute-0 ovn_controller[94909]: 2025-10-01T14:15:33Z|00127|binding|INFO|Setting lport 5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 up in Southbound
Oct 01 14:15:33 compute-0 nova_compute[192698]: 2025-10-01 14:15:33.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:34 compute-0 nova_compute[192698]: 2025-10-01 14:15:34.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:34 compute-0 nova_compute[192698]: 2025-10-01 14:15:34.400 2 INFO nova.compute.manager [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Post operation of migration started
Oct 01 14:15:34 compute-0 nova_compute[192698]: 2025-10-01 14:15:34.400 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:35 compute-0 nova_compute[192698]: 2025-10-01 14:15:35.183 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:35 compute-0 nova_compute[192698]: 2025-10-01 14:15:35.184 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:35 compute-0 nova_compute[192698]: 2025-10-01 14:15:35.263 2 DEBUG oslo_concurrency.lockutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-4420663f-9978-437d-94b6-3b804f40c5df" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:15:35 compute-0 nova_compute[192698]: 2025-10-01 14:15:35.264 2 DEBUG oslo_concurrency.lockutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-4420663f-9978-437d-94b6-3b804f40c5df" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:15:35 compute-0 nova_compute[192698]: 2025-10-01 14:15:35.264 2 DEBUG nova.network.neutron [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:15:35 compute-0 nova_compute[192698]: 2025-10-01 14:15:35.775 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:36 compute-0 nova_compute[192698]: 2025-10-01 14:15:36.196 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:36 compute-0 nova_compute[192698]: 2025-10-01 14:15:36.363 2 DEBUG nova.network.neutron [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Updating instance_info_cache with network_info: [{"id": "5e3b8a03-f64b-44cd-a4f4-4fe60fae5242", "address": "fa:16:3e:cc:c8:1b", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e3b8a03-f6", "ovs_interfaceid": "5e3b8a03-f64b-44cd-a4f4-4fe60fae5242", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:15:36 compute-0 nova_compute[192698]: 2025-10-01 14:15:36.876 2 DEBUG oslo_concurrency.lockutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-4420663f-9978-437d-94b6-3b804f40c5df" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:15:37 compute-0 nova_compute[192698]: 2025-10-01 14:15:37.399 2 DEBUG oslo_concurrency.lockutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:37 compute-0 nova_compute[192698]: 2025-10-01 14:15:37.400 2 DEBUG oslo_concurrency.lockutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:37 compute-0 nova_compute[192698]: 2025-10-01 14:15:37.401 2 DEBUG oslo_concurrency.lockutils [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:37 compute-0 nova_compute[192698]: 2025-10-01 14:15:37.408 2 INFO nova.virt.libvirt.driver [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 01 14:15:37 compute-0 virtqemud[192597]: Domain id=11 name='instance-0000000e' uuid=4420663f-9978-437d-94b6-3b804f40c5df is tainted: custom-monitor
Oct 01 14:15:38 compute-0 nova_compute[192698]: 2025-10-01 14:15:38.418 2 INFO nova.virt.libvirt.driver [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 01 14:15:38 compute-0 nova_compute[192698]: 2025-10-01 14:15:38.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:39 compute-0 nova_compute[192698]: 2025-10-01 14:15:39.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:39 compute-0 nova_compute[192698]: 2025-10-01 14:15:39.428 2 INFO nova.virt.libvirt.driver [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 01 14:15:39 compute-0 nova_compute[192698]: 2025-10-01 14:15:39.435 2 DEBUG nova.compute.manager [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:15:39 compute-0 nova_compute[192698]: 2025-10-01 14:15:39.949 2 DEBUG nova.objects.instance [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:15:40 compute-0 podman[220866]: 2025-10-01 14:15:40.184155074 +0000 UTC m=+0.091048484 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Oct 01 14:15:40 compute-0 nova_compute[192698]: 2025-10-01 14:15:40.970 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:41 compute-0 nova_compute[192698]: 2025-10-01 14:15:41.205 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:41 compute-0 nova_compute[192698]: 2025-10-01 14:15:41.206 2 WARNING neutronclient.v2_0.client [None req-55e6d585-d013-4285-af6e-10f774cf5e87 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:43 compute-0 nova_compute[192698]: 2025-10-01 14:15:43.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:44 compute-0 nova_compute[192698]: 2025-10-01 14:15:44.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:45 compute-0 nova_compute[192698]: 2025-10-01 14:15:45.438 2 DEBUG oslo_concurrency.lockutils [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:45 compute-0 nova_compute[192698]: 2025-10-01 14:15:45.439 2 DEBUG oslo_concurrency.lockutils [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:45 compute-0 nova_compute[192698]: 2025-10-01 14:15:45.439 2 DEBUG oslo_concurrency.lockutils [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:45 compute-0 nova_compute[192698]: 2025-10-01 14:15:45.439 2 DEBUG oslo_concurrency.lockutils [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:45 compute-0 nova_compute[192698]: 2025-10-01 14:15:45.439 2 DEBUG oslo_concurrency.lockutils [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:45 compute-0 nova_compute[192698]: 2025-10-01 14:15:45.452 2 INFO nova.compute.manager [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Terminating instance
Oct 01 14:15:45 compute-0 podman[220888]: 2025-10-01 14:15:45.552911082 +0000 UTC m=+0.073416177 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct 01 14:15:45 compute-0 podman[220887]: 2025-10-01 14:15:45.553287722 +0000 UTC m=+0.077598110 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 01 14:15:45 compute-0 nova_compute[192698]: 2025-10-01 14:15:45.971 2 DEBUG nova.compute.manager [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:15:45 compute-0 kernel: tap2f0aa00e-d9 (unregistering): left promiscuous mode
Oct 01 14:15:46 compute-0 NetworkManager[51741]: <info>  [1759328146.0036] device (tap2f0aa00e-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:15:46 compute-0 ovn_controller[94909]: 2025-10-01T14:15:46Z|00128|binding|INFO|Releasing lport 2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 from this chassis (sb_readonly=0)
Oct 01 14:15:46 compute-0 ovn_controller[94909]: 2025-10-01T14:15:46Z|00129|binding|INFO|Setting lport 2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 down in Southbound
Oct 01 14:15:46 compute-0 ovn_controller[94909]: 2025-10-01T14:15:46Z|00130|binding|INFO|Removing iface tap2f0aa00e-d9 ovn-installed in OVS
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.024 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:61:32:e5 10.100.0.5'], port_security=['fa:16:3e:61:32:e5 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '75d77ae6-fd71-4357-86ca-e8d2afafce7e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8562f9c0-0a2b-4e53-975b-dd543293c802', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f5565c36a294928af6bcd073bff4643', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'd07d6cb5-684b-4a4b-83f2-c6fbca49c797', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18a05557-2e37-4ffc-9c62-b55a7756059d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.025 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 in datapath 8562f9c0-0a2b-4e53-975b-dd543293c802 unbound from our chassis
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.027 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8562f9c0-0a2b-4e53-975b-dd543293c802
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.048 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[cba7c9d6-b8e9-49fb-bc8f-d85c84c810d5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.088 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[1fea40ed-e5f1-4c33-861c-0d2b13be4f53]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.091 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[defda223-dd78-4735-a077-019d454c7df8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:46 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Oct 01 14:15:46 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000f.scope: Consumed 14.763s CPU time.
Oct 01 14:15:46 compute-0 systemd-machined[152704]: Machine qemu-10-instance-0000000f terminated.
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.131 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[2243c624-fc86-47c9-a7fd-7916625ff4c4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.149 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee74638-15c8-473c-b1da-f644bbb304ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8562f9c0-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:ed:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445526, 'reachable_time': 21562, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220938, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.168 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[53cf6fd6-a0a8-4a40-a204-488cf83d37d5]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8562f9c0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445545, 'tstamp': 445545}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220939, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8562f9c0-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 445549, 'tstamp': 445549}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220939, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.170 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8562f9c0-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.176 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8562f9c0-00, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.176 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.176 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8562f9c0-00, col_values=(('external_ids', {'iface-id': 'b5ee4d88-5d32-4dfa-ae97-c0c0976243b5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.176 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:15:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:46.178 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[72401b6d-8267-42e0-8706-943bbeb7cde1]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-8562f9c0-0a2b-4e53-975b-dd543293c802\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 8562f9c0-0a2b-4e53-975b-dd543293c802\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.241 2 INFO nova.virt.libvirt.driver [-] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Instance destroyed successfully.
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.243 2 DEBUG nova.objects.instance [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lazy-loading 'resources' on Instance uuid 75d77ae6-fd71-4357-86ca-e8d2afafce7e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.312 2 DEBUG nova.compute.manager [req-06dac18e-7d4c-4a3a-8dea-0c8571d3e950 req-cb13babd-2870-4212-920c-ca061aa27837 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Received event network-vif-unplugged-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.312 2 DEBUG oslo_concurrency.lockutils [req-06dac18e-7d4c-4a3a-8dea-0c8571d3e950 req-cb13babd-2870-4212-920c-ca061aa27837 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.312 2 DEBUG oslo_concurrency.lockutils [req-06dac18e-7d4c-4a3a-8dea-0c8571d3e950 req-cb13babd-2870-4212-920c-ca061aa27837 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.313 2 DEBUG oslo_concurrency.lockutils [req-06dac18e-7d4c-4a3a-8dea-0c8571d3e950 req-cb13babd-2870-4212-920c-ca061aa27837 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.313 2 DEBUG nova.compute.manager [req-06dac18e-7d4c-4a3a-8dea-0c8571d3e950 req-cb13babd-2870-4212-920c-ca061aa27837 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] No waiting events found dispatching network-vif-unplugged-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.313 2 DEBUG nova.compute.manager [req-06dac18e-7d4c-4a3a-8dea-0c8571d3e950 req-cb13babd-2870-4212-920c-ca061aa27837 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Received event network-vif-unplugged-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.750 2 DEBUG nova.virt.libvirt.vif [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:14:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-70417407',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-70417407',id=15,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:14:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f5565c36a294928af6bcd073bff4643',ramdisk_id='',reservation_id='r-zy244pm6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-132658549',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-132658549-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:14:56Z,user_data=None,user_id='8e4b771b5757444093151a3e38c0b2d7',uuid=75d77ae6-fd71-4357-86ca-e8d2afafce7e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "address": "fa:16:3e:61:32:e5", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f0aa00e-d9", "ovs_interfaceid": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.750 2 DEBUG nova.network.os_vif_util [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converting VIF {"id": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "address": "fa:16:3e:61:32:e5", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f0aa00e-d9", "ovs_interfaceid": "2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.752 2 DEBUG nova.network.os_vif_util [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:61:32:e5,bridge_name='br-int',has_traffic_filtering=True,id=2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f0aa00e-d9') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.753 2 DEBUG os_vif [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:32:e5,bridge_name='br-int',has_traffic_filtering=True,id=2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f0aa00e-d9') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.755 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f0aa00e-d9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.761 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=588d1887-b4f6-42ff-8e39-0dead67b45b8) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.767 2 INFO os_vif [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:61:32:e5,bridge_name='br-int',has_traffic_filtering=True,id=2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f0aa00e-d9')
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.768 2 INFO nova.virt.libvirt.driver [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Deleting instance files /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e_del
Oct 01 14:15:46 compute-0 nova_compute[192698]: 2025-10-01 14:15:46.769 2 INFO nova.virt.libvirt.driver [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Deletion of /var/lib/nova/instances/75d77ae6-fd71-4357-86ca-e8d2afafce7e_del complete
Oct 01 14:15:47 compute-0 nova_compute[192698]: 2025-10-01 14:15:47.286 2 INFO nova.compute.manager [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Took 1.31 seconds to destroy the instance on the hypervisor.
Oct 01 14:15:47 compute-0 nova_compute[192698]: 2025-10-01 14:15:47.287 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:15:47 compute-0 nova_compute[192698]: 2025-10-01 14:15:47.288 2 DEBUG nova.compute.manager [-] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:15:47 compute-0 nova_compute[192698]: 2025-10-01 14:15:47.288 2 DEBUG nova.network.neutron [-] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:15:47 compute-0 nova_compute[192698]: 2025-10-01 14:15:47.288 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:48 compute-0 nova_compute[192698]: 2025-10-01 14:15:48.189 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:48 compute-0 nova_compute[192698]: 2025-10-01 14:15:48.373 2 DEBUG nova.compute.manager [req-3ce0b97e-94cd-46e0-9c77-fc31cf55745e req-4064b57f-e999-46a2-80f2-fc8500885ff5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Received event network-vif-unplugged-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:15:48 compute-0 nova_compute[192698]: 2025-10-01 14:15:48.373 2 DEBUG oslo_concurrency.lockutils [req-3ce0b97e-94cd-46e0-9c77-fc31cf55745e req-4064b57f-e999-46a2-80f2-fc8500885ff5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:48 compute-0 nova_compute[192698]: 2025-10-01 14:15:48.374 2 DEBUG oslo_concurrency.lockutils [req-3ce0b97e-94cd-46e0-9c77-fc31cf55745e req-4064b57f-e999-46a2-80f2-fc8500885ff5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:48 compute-0 nova_compute[192698]: 2025-10-01 14:15:48.374 2 DEBUG oslo_concurrency.lockutils [req-3ce0b97e-94cd-46e0-9c77-fc31cf55745e req-4064b57f-e999-46a2-80f2-fc8500885ff5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:48 compute-0 nova_compute[192698]: 2025-10-01 14:15:48.374 2 DEBUG nova.compute.manager [req-3ce0b97e-94cd-46e0-9c77-fc31cf55745e req-4064b57f-e999-46a2-80f2-fc8500885ff5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] No waiting events found dispatching network-vif-unplugged-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:15:48 compute-0 nova_compute[192698]: 2025-10-01 14:15:48.374 2 DEBUG nova.compute.manager [req-3ce0b97e-94cd-46e0-9c77-fc31cf55745e req-4064b57f-e999-46a2-80f2-fc8500885ff5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Received event network-vif-unplugged-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:15:48 compute-0 nova_compute[192698]: 2025-10-01 14:15:48.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:49 compute-0 nova_compute[192698]: 2025-10-01 14:15:49.700 2 DEBUG nova.network.neutron [-] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:15:50 compute-0 nova_compute[192698]: 2025-10-01 14:15:50.211 2 INFO nova.compute.manager [-] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Took 2.92 seconds to deallocate network for instance.
Oct 01 14:15:50 compute-0 nova_compute[192698]: 2025-10-01 14:15:50.428 2 DEBUG nova.compute.manager [req-7d016f6a-0ef9-439d-b645-fd09c769154e req-eee13eee-8b4f-4326-8227-87ee9fc71945 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 75d77ae6-fd71-4357-86ca-e8d2afafce7e] Received event network-vif-deleted-2f0aa00e-d9e4-4287-bc1e-b4c1d2b4dc77 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:15:50 compute-0 nova_compute[192698]: 2025-10-01 14:15:50.829 2 DEBUG oslo_concurrency.lockutils [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:50 compute-0 nova_compute[192698]: 2025-10-01 14:15:50.829 2 DEBUG oslo_concurrency.lockutils [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:50 compute-0 nova_compute[192698]: 2025-10-01 14:15:50.907 2 DEBUG nova.compute.provider_tree [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:15:51 compute-0 podman[220958]: 2025-10-01 14:15:51.180873612 +0000 UTC m=+0.088233918 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:15:51 compute-0 nova_compute[192698]: 2025-10-01 14:15:51.427 2 DEBUG nova.scheduler.client.report [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:15:51 compute-0 nova_compute[192698]: 2025-10-01 14:15:51.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:51 compute-0 nova_compute[192698]: 2025-10-01 14:15:51.937 2 DEBUG oslo_concurrency.lockutils [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:52 compute-0 nova_compute[192698]: 2025-10-01 14:15:52.126 2 INFO nova.scheduler.client.report [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Deleted allocations for instance 75d77ae6-fd71-4357-86ca-e8d2afafce7e
Oct 01 14:15:53 compute-0 nova_compute[192698]: 2025-10-01 14:15:53.193 2 DEBUG oslo_concurrency.lockutils [None req-6f673487-2f5c-49af-8df3-f79a1ef6a1dc 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "75d77ae6-fd71-4357-86ca-e8d2afafce7e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.755s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:53 compute-0 nova_compute[192698]: 2025-10-01 14:15:53.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:54 compute-0 nova_compute[192698]: 2025-10-01 14:15:54.004 2 DEBUG oslo_concurrency.lockutils [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "4420663f-9978-437d-94b6-3b804f40c5df" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:54 compute-0 nova_compute[192698]: 2025-10-01 14:15:54.005 2 DEBUG oslo_concurrency.lockutils [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "4420663f-9978-437d-94b6-3b804f40c5df" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:54 compute-0 nova_compute[192698]: 2025-10-01 14:15:54.005 2 DEBUG oslo_concurrency.lockutils [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "4420663f-9978-437d-94b6-3b804f40c5df-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:54 compute-0 nova_compute[192698]: 2025-10-01 14:15:54.006 2 DEBUG oslo_concurrency.lockutils [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "4420663f-9978-437d-94b6-3b804f40c5df-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:54 compute-0 nova_compute[192698]: 2025-10-01 14:15:54.006 2 DEBUG oslo_concurrency.lockutils [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "4420663f-9978-437d-94b6-3b804f40c5df-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:54 compute-0 nova_compute[192698]: 2025-10-01 14:15:54.023 2 INFO nova.compute.manager [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Terminating instance
Oct 01 14:15:54 compute-0 nova_compute[192698]: 2025-10-01 14:15:54.544 2 DEBUG nova.compute.manager [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:15:54 compute-0 kernel: tap5e3b8a03-f6 (unregistering): left promiscuous mode
Oct 01 14:15:54 compute-0 NetworkManager[51741]: <info>  [1759328154.5826] device (tap5e3b8a03-f6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:15:54 compute-0 nova_compute[192698]: 2025-10-01 14:15:54.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:54 compute-0 ovn_controller[94909]: 2025-10-01T14:15:54Z|00131|binding|INFO|Releasing lport 5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 from this chassis (sb_readonly=0)
Oct 01 14:15:54 compute-0 ovn_controller[94909]: 2025-10-01T14:15:54Z|00132|binding|INFO|Setting lport 5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 down in Southbound
Oct 01 14:15:54 compute-0 ovn_controller[94909]: 2025-10-01T14:15:54Z|00133|binding|INFO|Removing iface tap5e3b8a03-f6 ovn-installed in OVS
Oct 01 14:15:54 compute-0 nova_compute[192698]: 2025-10-01 14:15:54.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:54.609 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:c8:1b 10.100.0.6'], port_security=['fa:16:3e:cc:c8:1b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '4420663f-9978-437d-94b6-3b804f40c5df', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8562f9c0-0a2b-4e53-975b-dd543293c802', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f5565c36a294928af6bcd073bff4643', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'd07d6cb5-684b-4a4b-83f2-c6fbca49c797', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18a05557-2e37-4ffc-9c62-b55a7756059d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=5e3b8a03-f64b-44cd-a4f4-4fe60fae5242) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:15:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:54.611 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 in datapath 8562f9c0-0a2b-4e53-975b-dd543293c802 unbound from our chassis
Oct 01 14:15:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:54.612 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8562f9c0-0a2b-4e53-975b-dd543293c802, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:15:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:54.613 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[241c0c2a-bf14-411d-b7e7-f68dcc664bdb]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:54 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:54.613 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802 namespace which is not needed anymore
Oct 01 14:15:54 compute-0 nova_compute[192698]: 2025-10-01 14:15:54.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:54 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Oct 01 14:15:54 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000e.scope: Consumed 2.575s CPU time.
Oct 01 14:15:54 compute-0 systemd-machined[152704]: Machine qemu-11-instance-0000000e terminated.
Oct 01 14:15:54 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[220565]: [NOTICE]   (220569) : haproxy version is 3.0.5-8e879a5
Oct 01 14:15:54 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[220565]: [NOTICE]   (220569) : path to executable is /usr/sbin/haproxy
Oct 01 14:15:54 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[220565]: [WARNING]  (220569) : Exiting Master process...
Oct 01 14:15:54 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[220565]: [ALERT]    (220569) : Current worker (220571) exited with code 143 (Terminated)
Oct 01 14:15:54 compute-0 neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802[220565]: [WARNING]  (220569) : All workers exited. Exiting... (0)
Oct 01 14:15:54 compute-0 podman[221008]: 2025-10-01 14:15:54.805429181 +0000 UTC m=+0.058289118 container kill ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:15:54 compute-0 systemd[1]: libpod-ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a.scope: Deactivated successfully.
Oct 01 14:15:54 compute-0 conmon[220565]: conmon ebc60fee0704b9093ac9 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a.scope/container/memory.events
Oct 01 14:15:54 compute-0 nova_compute[192698]: 2025-10-01 14:15:54.830 2 INFO nova.virt.libvirt.driver [-] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Instance destroyed successfully.
Oct 01 14:15:54 compute-0 nova_compute[192698]: 2025-10-01 14:15:54.831 2 DEBUG nova.objects.instance [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lazy-loading 'resources' on Instance uuid 4420663f-9978-437d-94b6-3b804f40c5df obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:15:54 compute-0 podman[221037]: 2025-10-01 14:15:54.877782289 +0000 UTC m=+0.049307645 container died ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 01 14:15:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a-userdata-shm.mount: Deactivated successfully.
Oct 01 14:15:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-2c48c5f5544a6d4bf02a9f3c75a1b2c1a015ac5c42c5120994bccd413c28ea5a-merged.mount: Deactivated successfully.
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.296 2 DEBUG nova.compute.manager [req-3fe111f3-1101-45b1-b235-a0966754f3c8 req-94e692d1-5e24-4baa-b686-eaa5c6491a41 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Received event network-vif-unplugged-5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.297 2 DEBUG oslo_concurrency.lockutils [req-3fe111f3-1101-45b1-b235-a0966754f3c8 req-94e692d1-5e24-4baa-b686-eaa5c6491a41 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "4420663f-9978-437d-94b6-3b804f40c5df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.297 2 DEBUG oslo_concurrency.lockutils [req-3fe111f3-1101-45b1-b235-a0966754f3c8 req-94e692d1-5e24-4baa-b686-eaa5c6491a41 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "4420663f-9978-437d-94b6-3b804f40c5df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.298 2 DEBUG oslo_concurrency.lockutils [req-3fe111f3-1101-45b1-b235-a0966754f3c8 req-94e692d1-5e24-4baa-b686-eaa5c6491a41 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "4420663f-9978-437d-94b6-3b804f40c5df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.298 2 DEBUG nova.compute.manager [req-3fe111f3-1101-45b1-b235-a0966754f3c8 req-94e692d1-5e24-4baa-b686-eaa5c6491a41 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] No waiting events found dispatching network-vif-unplugged-5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.298 2 DEBUG nova.compute.manager [req-3fe111f3-1101-45b1-b235-a0966754f3c8 req-94e692d1-5e24-4baa-b686-eaa5c6491a41 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Received event network-vif-unplugged-5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.338 2 DEBUG nova.virt.libvirt.vif [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-01T14:14:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-41012981',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-41012981',id=14,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:14:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f5565c36a294928af6bcd073bff4643',ramdisk_id='',reservation_id='r-ow3psbp4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-132658549',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-132658549-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:15:40Z,user_data=None,user_id='8e4b771b5757444093151a3e38c0b2d7',uuid=4420663f-9978-437d-94b6-3b804f40c5df,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5e3b8a03-f64b-44cd-a4f4-4fe60fae5242", "address": "fa:16:3e:cc:c8:1b", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e3b8a03-f6", "ovs_interfaceid": "5e3b8a03-f64b-44cd-a4f4-4fe60fae5242", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.339 2 DEBUG nova.network.os_vif_util [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converting VIF {"id": "5e3b8a03-f64b-44cd-a4f4-4fe60fae5242", "address": "fa:16:3e:cc:c8:1b", "network": {"id": "8562f9c0-0a2b-4e53-975b-dd543293c802", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-1048948457-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8120df3906db49b8ac8fa624e2f2aad4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e3b8a03-f6", "ovs_interfaceid": "5e3b8a03-f64b-44cd-a4f4-4fe60fae5242", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.340 2 DEBUG nova.network.os_vif_util [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cc:c8:1b,bridge_name='br-int',has_traffic_filtering=True,id=5e3b8a03-f64b-44cd-a4f4-4fe60fae5242,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e3b8a03-f6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.341 2 DEBUG os_vif [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:c8:1b,bridge_name='br-int',has_traffic_filtering=True,id=5e3b8a03-f64b-44cd-a4f4-4fe60fae5242,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e3b8a03-f6') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.344 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e3b8a03-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.350 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=24082f77-82a6-4e87-9385-2f6eab4325b7) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.356 2 INFO os_vif [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:c8:1b,bridge_name='br-int',has_traffic_filtering=True,id=5e3b8a03-f64b-44cd-a4f4-4fe60fae5242,network=Network(8562f9c0-0a2b-4e53-975b-dd543293c802),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e3b8a03-f6')
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.356 2 INFO nova.virt.libvirt.driver [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Deleting instance files /var/lib/nova/instances/4420663f-9978-437d-94b6-3b804f40c5df_del
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.357 2 INFO nova.virt.libvirt.driver [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Deletion of /var/lib/nova/instances/4420663f-9978-437d-94b6-3b804f40c5df_del complete
Oct 01 14:15:55 compute-0 podman[221037]: 2025-10-01 14:15:55.487472654 +0000 UTC m=+0.658997910 container cleanup ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Oct 01 14:15:55 compute-0 systemd[1]: libpod-conmon-ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a.scope: Deactivated successfully.
Oct 01 14:15:55 compute-0 podman[221053]: 2025-10-01 14:15:55.538566436 +0000 UTC m=+0.636831240 container remove ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 01 14:15:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:55.546 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f1634715-8831-45b2-b18a-d40ae65fa8d5]: (4, ("Wed Oct  1 02:15:54 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802 (ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a)\nebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a\nWed Oct  1 02:15:54 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802 (ebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a)\nebc60fee0704b9093ac90d25972a64e088132dd652f57bee27d3545e07c9fc5a\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:55.548 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[54ae680c-a27d-4650-a9e7-aa2869f4d841]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:55.549 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8562f9c0-0a2b-4e53-975b-dd543293c802.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:15:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:55.550 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7ba7e7d7-01a1-4455-af0a-818f2dfd08a8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:55.551 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8562f9c0-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:55 compute-0 kernel: tap8562f9c0-00: left promiscuous mode
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:55.577 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a799637f-53d8-42dc-b0af-7081ede3bd69]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:55.611 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f02df447-ad0a-4959-93ab-ede30b1771a3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:55.612 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[22617e1f-7fa7-450d-b9c5-153877770043]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:55.638 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[125bcf68-09a6-44d1-b47e-860c5b669814]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 445515, 'reachable_time': 18918, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221071, 'error': None, 'target': 'ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:55.642 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8562f9c0-0a2b-4e53-975b-dd543293c802 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:15:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:55.643 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[cb253b63-f571-4781-a268-d12fd2770f53]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:15:55 compute-0 systemd[1]: run-netns-ovnmeta\x2d8562f9c0\x2d0a2b\x2d4e53\x2d975b\x2ddd543293c802.mount: Deactivated successfully.
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.871 2 INFO nova.compute.manager [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Took 1.33 seconds to destroy the instance on the hypervisor.
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.871 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.871 2 DEBUG nova.compute.manager [-] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.872 2 DEBUG nova.network.neutron [-] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:15:55 compute-0 nova_compute[192698]: 2025-10-01 14:15:55.872 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:56 compute-0 nova_compute[192698]: 2025-10-01 14:15:56.191 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:15:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:56.442 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:15:56 compute-0 nova_compute[192698]: 2025-10-01 14:15:56.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:15:56.443 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:15:57 compute-0 nova_compute[192698]: 2025-10-01 14:15:57.001 2 DEBUG nova.network.neutron [-] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:15:57 compute-0 nova_compute[192698]: 2025-10-01 14:15:57.353 2 DEBUG nova.compute.manager [req-236e31d5-7149-47c1-a4a7-8a5d2e7a679b req-1ea118a8-61db-4d36-924e-7cea4fb5e4b6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Received event network-vif-unplugged-5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:15:57 compute-0 nova_compute[192698]: 2025-10-01 14:15:57.354 2 DEBUG oslo_concurrency.lockutils [req-236e31d5-7149-47c1-a4a7-8a5d2e7a679b req-1ea118a8-61db-4d36-924e-7cea4fb5e4b6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "4420663f-9978-437d-94b6-3b804f40c5df-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:57 compute-0 nova_compute[192698]: 2025-10-01 14:15:57.354 2 DEBUG oslo_concurrency.lockutils [req-236e31d5-7149-47c1-a4a7-8a5d2e7a679b req-1ea118a8-61db-4d36-924e-7cea4fb5e4b6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "4420663f-9978-437d-94b6-3b804f40c5df-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:57 compute-0 nova_compute[192698]: 2025-10-01 14:15:57.355 2 DEBUG oslo_concurrency.lockutils [req-236e31d5-7149-47c1-a4a7-8a5d2e7a679b req-1ea118a8-61db-4d36-924e-7cea4fb5e4b6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "4420663f-9978-437d-94b6-3b804f40c5df-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:57 compute-0 nova_compute[192698]: 2025-10-01 14:15:57.355 2 DEBUG nova.compute.manager [req-236e31d5-7149-47c1-a4a7-8a5d2e7a679b req-1ea118a8-61db-4d36-924e-7cea4fb5e4b6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] No waiting events found dispatching network-vif-unplugged-5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:15:57 compute-0 nova_compute[192698]: 2025-10-01 14:15:57.356 2 DEBUG nova.compute.manager [req-236e31d5-7149-47c1-a4a7-8a5d2e7a679b req-1ea118a8-61db-4d36-924e-7cea4fb5e4b6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Received event network-vif-unplugged-5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:15:57 compute-0 nova_compute[192698]: 2025-10-01 14:15:57.356 2 DEBUG nova.compute.manager [req-236e31d5-7149-47c1-a4a7-8a5d2e7a679b req-1ea118a8-61db-4d36-924e-7cea4fb5e4b6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Received event network-vif-deleted-5e3b8a03-f64b-44cd-a4f4-4fe60fae5242 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:15:57 compute-0 nova_compute[192698]: 2025-10-01 14:15:57.509 2 INFO nova.compute.manager [-] [instance: 4420663f-9978-437d-94b6-3b804f40c5df] Took 1.64 seconds to deallocate network for instance.
Oct 01 14:15:58 compute-0 nova_compute[192698]: 2025-10-01 14:15:58.030 2 DEBUG oslo_concurrency.lockutils [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:15:58 compute-0 nova_compute[192698]: 2025-10-01 14:15:58.031 2 DEBUG oslo_concurrency.lockutils [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:15:58 compute-0 nova_compute[192698]: 2025-10-01 14:15:58.080 2 DEBUG oslo_concurrency.lockutils [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.049s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:58 compute-0 nova_compute[192698]: 2025-10-01 14:15:58.181 2 INFO nova.scheduler.client.report [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Deleted allocations for instance 4420663f-9978-437d-94b6-3b804f40c5df
Oct 01 14:15:58 compute-0 nova_compute[192698]: 2025-10-01 14:15:58.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:15:59 compute-0 nova_compute[192698]: 2025-10-01 14:15:59.223 2 DEBUG oslo_concurrency.lockutils [None req-50a37eee-142a-4ded-a659-0917fb6d4020 8e4b771b5757444093151a3e38c0b2d7 9f5565c36a294928af6bcd073bff4643 - - default default] Lock "4420663f-9978-437d-94b6-3b804f40c5df" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.218s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:15:59 compute-0 podman[203144]: time="2025-10-01T14:15:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:15:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:15:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:15:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:15:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3021 "" "Go-http-client/1.1"
Oct 01 14:16:00 compute-0 nova_compute[192698]: 2025-10-01 14:16:00.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:01 compute-0 openstack_network_exporter[205307]: ERROR   14:16:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:16:01 compute-0 openstack_network_exporter[205307]: ERROR   14:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:16:01 compute-0 openstack_network_exporter[205307]: ERROR   14:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:16:01 compute-0 openstack_network_exporter[205307]: ERROR   14:16:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:16:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:16:01 compute-0 openstack_network_exporter[205307]: ERROR   14:16:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:16:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:16:01 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:01.446 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:16:03 compute-0 podman[221073]: 2025-10-01 14:16:03.201682873 +0000 UTC m=+0.110238817 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 01 14:16:03 compute-0 podman[221074]: 2025-10-01 14:16:03.249095538 +0000 UTC m=+0.147430055 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Oct 01 14:16:03 compute-0 nova_compute[192698]: 2025-10-01 14:16:03.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:03 compute-0 nova_compute[192698]: 2025-10-01 14:16:03.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:05 compute-0 nova_compute[192698]: 2025-10-01 14:16:05.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:08 compute-0 nova_compute[192698]: 2025-10-01 14:16:08.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:10 compute-0 nova_compute[192698]: 2025-10-01 14:16:10.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:11 compute-0 podman[221120]: 2025-10-01 14:16:11.180112538 +0000 UTC m=+0.087062029 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct 01 14:16:12 compute-0 sshd-session[221143]: error: kex_exchange_identification: read: Connection reset by peer
Oct 01 14:16:12 compute-0 sshd-session[221143]: Connection reset by 45.140.17.97 port 65089
Oct 01 14:16:13 compute-0 nova_compute[192698]: 2025-10-01 14:16:13.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:13 compute-0 nova_compute[192698]: 2025-10-01 14:16:13.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:16:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:14.262 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:16:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:14.262 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:16:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:14.262 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:16:14 compute-0 nova_compute[192698]: 2025-10-01 14:16:14.458 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:16:14 compute-0 nova_compute[192698]: 2025-10-01 14:16:14.458 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:16:14 compute-0 nova_compute[192698]: 2025-10-01 14:16:14.459 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:16:14 compute-0 nova_compute[192698]: 2025-10-01 14:16:14.459 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:16:14 compute-0 nova_compute[192698]: 2025-10-01 14:16:14.636 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:16:14 compute-0 nova_compute[192698]: 2025-10-01 14:16:14.640 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:16:14 compute-0 nova_compute[192698]: 2025-10-01 14:16:14.669 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:16:14 compute-0 nova_compute[192698]: 2025-10-01 14:16:14.670 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5844MB free_disk=73.30322265625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:16:14 compute-0 nova_compute[192698]: 2025-10-01 14:16:14.670 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:16:14 compute-0 nova_compute[192698]: 2025-10-01 14:16:14.670 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:16:15 compute-0 nova_compute[192698]: 2025-10-01 14:16:15.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:15.513 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:7d:2f 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-311ee548-6614-4a7e-89b8-72a983d5ffb5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-311ee548-6614-4a7e-89b8-72a983d5ffb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '123cb2c9c5024f2da31f757a4443bdaa', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed926afc-775b-4259-a276-a08c58ffcb3e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4fc92e8-a8e1-4ac3-8260-4a9bc8aaf801) old=Port_Binding(mac=['fa:16:3e:79:7d:2f'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-311ee548-6614-4a7e-89b8-72a983d5ffb5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-311ee548-6614-4a7e-89b8-72a983d5ffb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '123cb2c9c5024f2da31f757a4443bdaa', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:16:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:15.513 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4fc92e8-a8e1-4ac3-8260-4a9bc8aaf801 in datapath 311ee548-6614-4a7e-89b8-72a983d5ffb5 updated
Oct 01 14:16:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:15.515 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 311ee548-6614-4a7e-89b8-72a983d5ffb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:16:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:15.516 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[da70bc52-1911-4e49-94ea-90da8cc20636]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:16:16 compute-0 nova_compute[192698]: 2025-10-01 14:16:16.072 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:16:16 compute-0 nova_compute[192698]: 2025-10-01 14:16:16.072 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:16:14 up  1:15,  0 user,  load average: 0.26, 0.25, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:16:16 compute-0 nova_compute[192698]: 2025-10-01 14:16:16.112 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:16:16 compute-0 podman[221148]: 2025-10-01 14:16:16.187289316 +0000 UTC m=+0.084567942 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:16:16 compute-0 podman[221147]: 2025-10-01 14:16:16.202043265 +0000 UTC m=+0.108713145 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:16:16 compute-0 nova_compute[192698]: 2025-10-01 14:16:16.636 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:16:17 compute-0 nova_compute[192698]: 2025-10-01 14:16:17.232 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:16:17 compute-0 nova_compute[192698]: 2025-10-01 14:16:17.233 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.563s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:16:18 compute-0 nova_compute[192698]: 2025-10-01 14:16:18.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:20 compute-0 nova_compute[192698]: 2025-10-01 14:16:20.234 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:16:20 compute-0 nova_compute[192698]: 2025-10-01 14:16:20.234 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:16:20 compute-0 nova_compute[192698]: 2025-10-01 14:16:20.235 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:16:20 compute-0 nova_compute[192698]: 2025-10-01 14:16:20.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:20 compute-0 nova_compute[192698]: 2025-10-01 14:16:20.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:16:20 compute-0 nova_compute[192698]: 2025-10-01 14:16:20.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:16:22 compute-0 podman[221185]: 2025-10-01 14:16:22.16946918 +0000 UTC m=+0.074642912 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:16:23 compute-0 nova_compute[192698]: 2025-10-01 14:16:23.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:23.739 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:46:6d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0e7a975d-d721-4c1b-b5a4-d457134e2ca6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e7a975d-d721-4c1b-b5a4-d457134e2ca6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '205b0c79ab9944b09686b559cad9b199', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6f332375-7461-4a85-846d-0d1e4b0dd1d1, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6898ac81-cacf-4d81-a623-b0afcea18a4a) old=Port_Binding(mac=['fa:16:3e:ec:46:6d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-0e7a975d-d721-4c1b-b5a4-d457134e2ca6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0e7a975d-d721-4c1b-b5a4-d457134e2ca6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '205b0c79ab9944b09686b559cad9b199', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:16:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:23.740 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6898ac81-cacf-4d81-a623-b0afcea18a4a in datapath 0e7a975d-d721-4c1b-b5a4-d457134e2ca6 updated
Oct 01 14:16:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:23.742 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0e7a975d-d721-4c1b-b5a4-d457134e2ca6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:16:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:23.743 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[6f3747ec-57b3-4057-9a0f-08f0d3bc83d8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:16:23 compute-0 nova_compute[192698]: 2025-10-01 14:16:23.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:16:25 compute-0 nova_compute[192698]: 2025-10-01 14:16:25.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:25 compute-0 nova_compute[192698]: 2025-10-01 14:16:25.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:16:25 compute-0 nova_compute[192698]: 2025-10-01 14:16:25.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:16:28 compute-0 nova_compute[192698]: 2025-10-01 14:16:28.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:29 compute-0 podman[203144]: time="2025-10-01T14:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:16:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:16:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3024 "" "Go-http-client/1.1"
Oct 01 14:16:30 compute-0 nova_compute[192698]: 2025-10-01 14:16:30.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:31 compute-0 openstack_network_exporter[205307]: ERROR   14:16:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:16:31 compute-0 openstack_network_exporter[205307]: ERROR   14:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:16:31 compute-0 openstack_network_exporter[205307]: ERROR   14:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:16:31 compute-0 openstack_network_exporter[205307]: ERROR   14:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:16:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:16:31 compute-0 openstack_network_exporter[205307]: ERROR   14:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:16:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:16:33 compute-0 nova_compute[192698]: 2025-10-01 14:16:33.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:34 compute-0 podman[221209]: 2025-10-01 14:16:34.142193149 +0000 UTC m=+0.061464076 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 01 14:16:34 compute-0 podman[221210]: 2025-10-01 14:16:34.208611428 +0000 UTC m=+0.120295449 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:16:35 compute-0 nova_compute[192698]: 2025-10-01 14:16:35.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:36 compute-0 ovn_controller[94909]: 2025-10-01T14:16:36Z|00134|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct 01 14:16:38 compute-0 nova_compute[192698]: 2025-10-01 14:16:38.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:40 compute-0 nova_compute[192698]: 2025-10-01 14:16:40.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:42 compute-0 podman[221255]: 2025-10-01 14:16:42.178410321 +0000 UTC m=+0.089446744 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter)
Oct 01 14:16:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:42.558 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:6c:81 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c9696bee230443aa9465a892b11ae6b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=6dd814dc-cba2-4392-85ef-eadb8c4615f7) old=Port_Binding(mac=['fa:16:3e:79:6c:81'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c9696bee230443aa9465a892b11ae6b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:16:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:42.559 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 6dd814dc-cba2-4392-85ef-eadb8c4615f7 in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 updated
Oct 01 14:16:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:42.560 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 031a8987-8430-4fb6-a464-01e4dca2fae7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:16:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:42.562 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d73ce4c5-11a4-4f93-a08e-7769916521f0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:16:43 compute-0 nova_compute[192698]: 2025-10-01 14:16:43.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:45 compute-0 nova_compute[192698]: 2025-10-01 14:16:45.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:47 compute-0 podman[221277]: 2025-10-01 14:16:47.157921138 +0000 UTC m=+0.068690472 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:16:47 compute-0 podman[221276]: 2025-10-01 14:16:47.173383646 +0000 UTC m=+0.087356617 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2)
Oct 01 14:16:48 compute-0 nova_compute[192698]: 2025-10-01 14:16:48.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:50 compute-0 nova_compute[192698]: 2025-10-01 14:16:50.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:51.487 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:a0:26 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c13fc77f-3bd6-491b-bbe4-1aa59ec5f1c9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c13fc77f-3bd6-491b-bbe4-1aa59ec5f1c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9732ac97-98fb-4d30-9b38-07334ec3182e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d50a2019-b5c6-47da-851c-8918b83d623e) old=Port_Binding(mac=['fa:16:3e:08:a0:26'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c13fc77f-3bd6-491b-bbe4-1aa59ec5f1c9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c13fc77f-3bd6-491b-bbe4-1aa59ec5f1c9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:16:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:51.489 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d50a2019-b5c6-47da-851c-8918b83d623e in datapath c13fc77f-3bd6-491b-bbe4-1aa59ec5f1c9 updated
Oct 01 14:16:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:51.490 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c13fc77f-3bd6-491b-bbe4-1aa59ec5f1c9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:16:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:16:51.491 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[9c49bc6a-7eb8-4b20-89e2-c03740d3710f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:16:53 compute-0 podman[221312]: 2025-10-01 14:16:53.152646463 +0000 UTC m=+0.069984046 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:16:53 compute-0 nova_compute[192698]: 2025-10-01 14:16:53.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:55 compute-0 nova_compute[192698]: 2025-10-01 14:16:55.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:58 compute-0 nova_compute[192698]: 2025-10-01 14:16:58.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:16:59 compute-0 podman[203144]: time="2025-10-01T14:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:16:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:16:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3024 "" "Go-http-client/1.1"
Oct 01 14:17:00 compute-0 nova_compute[192698]: 2025-10-01 14:17:00.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:01 compute-0 openstack_network_exporter[205307]: ERROR   14:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:17:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:17:01 compute-0 openstack_network_exporter[205307]: ERROR   14:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:17:01 compute-0 openstack_network_exporter[205307]: ERROR   14:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:17:01 compute-0 openstack_network_exporter[205307]: ERROR   14:17:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:17:01 compute-0 openstack_network_exporter[205307]: ERROR   14:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:17:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:17:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:03.528 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:17:03 compute-0 nova_compute[192698]: 2025-10-01 14:17:03.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:03.529 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:17:03 compute-0 nova_compute[192698]: 2025-10-01 14:17:03.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:05 compute-0 podman[221339]: 2025-10-01 14:17:05.176402914 +0000 UTC m=+0.083731488 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 01 14:17:05 compute-0 podman[221340]: 2025-10-01 14:17:05.214072174 +0000 UTC m=+0.123941447 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 14:17:05 compute-0 nova_compute[192698]: 2025-10-01 14:17:05.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:08 compute-0 nova_compute[192698]: 2025-10-01 14:17:08.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:10 compute-0 nova_compute[192698]: 2025-10-01 14:17:10.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:10 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:10.531 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:17:13 compute-0 podman[221386]: 2025-10-01 14:17:13.181136504 +0000 UTC m=+0.091106069 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 01 14:17:13 compute-0 nova_compute[192698]: 2025-10-01 14:17:13.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:13 compute-0 nova_compute[192698]: 2025-10-01 14:17:13.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:17:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:14.263 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:17:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:14.264 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:17:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:14.264 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:17:14 compute-0 nova_compute[192698]: 2025-10-01 14:17:14.488 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:17:14 compute-0 nova_compute[192698]: 2025-10-01 14:17:14.489 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:17:14 compute-0 nova_compute[192698]: 2025-10-01 14:17:14.490 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:17:14 compute-0 nova_compute[192698]: 2025-10-01 14:17:14.490 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:17:14 compute-0 nova_compute[192698]: 2025-10-01 14:17:14.686 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:17:14 compute-0 nova_compute[192698]: 2025-10-01 14:17:14.688 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:17:14 compute-0 nova_compute[192698]: 2025-10-01 14:17:14.729 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:17:14 compute-0 nova_compute[192698]: 2025-10-01 14:17:14.730 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5848MB free_disk=73.30322265625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:17:14 compute-0 nova_compute[192698]: 2025-10-01 14:17:14.730 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:17:14 compute-0 nova_compute[192698]: 2025-10-01 14:17:14.730 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:17:15 compute-0 nova_compute[192698]: 2025-10-01 14:17:15.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:15 compute-0 nova_compute[192698]: 2025-10-01 14:17:15.785 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:17:15 compute-0 nova_compute[192698]: 2025-10-01 14:17:15.786 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:17:14 up  1:16,  0 user,  load average: 0.09, 0.20, 0.34\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:17:15 compute-0 nova_compute[192698]: 2025-10-01 14:17:15.808 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:17:16 compute-0 nova_compute[192698]: 2025-10-01 14:17:16.318 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:17:16 compute-0 nova_compute[192698]: 2025-10-01 14:17:16.831 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:17:16 compute-0 nova_compute[192698]: 2025-10-01 14:17:16.831 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:17:18 compute-0 podman[221409]: 2025-10-01 14:17:18.163389635 +0000 UTC m=+0.068201378 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 01 14:17:18 compute-0 podman[221410]: 2025-10-01 14:17:18.181246258 +0000 UTC m=+0.084031197 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct 01 14:17:18 compute-0 nova_compute[192698]: 2025-10-01 14:17:18.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:18 compute-0 nova_compute[192698]: 2025-10-01 14:17:18.777 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:17:18 compute-0 nova_compute[192698]: 2025-10-01 14:17:18.778 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:17:19 compute-0 nova_compute[192698]: 2025-10-01 14:17:19.287 2 DEBUG nova.compute.manager [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 01 14:17:19 compute-0 nova_compute[192698]: 2025-10-01 14:17:19.832 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:17:19 compute-0 nova_compute[192698]: 2025-10-01 14:17:19.833 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:17:19 compute-0 nova_compute[192698]: 2025-10-01 14:17:19.841 2 DEBUG nova.virt.hardware [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:17:19 compute-0 nova_compute[192698]: 2025-10-01 14:17:19.841 2 INFO nova.compute.claims [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:17:20 compute-0 nova_compute[192698]: 2025-10-01 14:17:20.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:20 compute-0 nova_compute[192698]: 2025-10-01 14:17:20.831 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:17:20 compute-0 nova_compute[192698]: 2025-10-01 14:17:20.831 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:17:20 compute-0 nova_compute[192698]: 2025-10-01 14:17:20.909 2 DEBUG nova.compute.provider_tree [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:17:20 compute-0 nova_compute[192698]: 2025-10-01 14:17:20.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:17:21 compute-0 nova_compute[192698]: 2025-10-01 14:17:21.416 2 DEBUG nova.scheduler.client.report [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:17:21 compute-0 nova_compute[192698]: 2025-10-01 14:17:21.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:17:21 compute-0 nova_compute[192698]: 2025-10-01 14:17:21.936 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:17:21 compute-0 nova_compute[192698]: 2025-10-01 14:17:21.936 2 DEBUG nova.compute.manager [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 01 14:17:22 compute-0 nova_compute[192698]: 2025-10-01 14:17:22.434 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:17:22 compute-0 nova_compute[192698]: 2025-10-01 14:17:22.461 2 DEBUG nova.compute.manager [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 01 14:17:22 compute-0 nova_compute[192698]: 2025-10-01 14:17:22.462 2 DEBUG nova.network.neutron [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 01 14:17:22 compute-0 nova_compute[192698]: 2025-10-01 14:17:22.463 2 WARNING neutronclient.v2_0.client [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:17:22 compute-0 nova_compute[192698]: 2025-10-01 14:17:22.464 2 WARNING neutronclient.v2_0.client [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:17:22 compute-0 nova_compute[192698]: 2025-10-01 14:17:22.974 2 INFO nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 01 14:17:23 compute-0 nova_compute[192698]: 2025-10-01 14:17:23.434 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:17:23 compute-0 nova_compute[192698]: 2025-10-01 14:17:23.481 2 DEBUG nova.compute.manager [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 01 14:17:23 compute-0 nova_compute[192698]: 2025-10-01 14:17:23.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:23 compute-0 nova_compute[192698]: 2025-10-01 14:17:23.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:17:24 compute-0 podman[221446]: 2025-10-01 14:17:24.177195137 +0000 UTC m=+0.084013967 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.308 2 DEBUG nova.network.neutron [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Successfully created port: 23321592-5912-475c-80cc-9fe5944d128d _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.500 2 DEBUG nova.compute.manager [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.502 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.503 2 INFO nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Creating image(s)
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.504 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.504 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.505 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.506 2 DEBUG oslo_utils.imageutils.format_inspector [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.513 2 DEBUG oslo_utils.imageutils.format_inspector [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.521 2 DEBUG oslo_concurrency.processutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.605 2 DEBUG oslo_concurrency.processutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.607 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.608 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.609 2 DEBUG oslo_utils.imageutils.format_inspector [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.614 2 DEBUG oslo_utils.imageutils.format_inspector [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.615 2 DEBUG oslo_concurrency.processutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.702 2 DEBUG oslo_concurrency.processutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.704 2 DEBUG oslo_concurrency.processutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.755 2 DEBUG oslo_concurrency.processutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.756 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.149s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.757 2 DEBUG oslo_concurrency.processutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.861 2 DEBUG oslo_concurrency.processutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.863 2 DEBUG nova.virt.disk.api [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Checking if we can resize image /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.864 2 DEBUG oslo_concurrency.processutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.947 2 DEBUG oslo_concurrency.processutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.949 2 DEBUG nova.virt.disk.api [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Cannot resize image /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.950 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.950 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Ensure instance console log exists: /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.951 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.952 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:17:24 compute-0 nova_compute[192698]: 2025-10-01 14:17:24.952 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:17:25 compute-0 nova_compute[192698]: 2025-10-01 14:17:25.142 2 DEBUG nova.network.neutron [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Successfully updated port: 23321592-5912-475c-80cc-9fe5944d128d _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 01 14:17:25 compute-0 nova_compute[192698]: 2025-10-01 14:17:25.208 2 DEBUG nova.compute.manager [req-049e1d27-59d7-4424-9a95-573ff0a5ee28 req-0d83baf6-970d-4c60-a4ec-ffb9f832340e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-changed-23321592-5912-475c-80cc-9fe5944d128d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:17:25 compute-0 nova_compute[192698]: 2025-10-01 14:17:25.208 2 DEBUG nova.compute.manager [req-049e1d27-59d7-4424-9a95-573ff0a5ee28 req-0d83baf6-970d-4c60-a4ec-ffb9f832340e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Refreshing instance network info cache due to event network-changed-23321592-5912-475c-80cc-9fe5944d128d. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:17:25 compute-0 nova_compute[192698]: 2025-10-01 14:17:25.209 2 DEBUG oslo_concurrency.lockutils [req-049e1d27-59d7-4424-9a95-573ff0a5ee28 req-0d83baf6-970d-4c60-a4ec-ffb9f832340e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-aad5638f-3b4c-43c9-a453-2cd987bcc593" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:17:25 compute-0 nova_compute[192698]: 2025-10-01 14:17:25.209 2 DEBUG oslo_concurrency.lockutils [req-049e1d27-59d7-4424-9a95-573ff0a5ee28 req-0d83baf6-970d-4c60-a4ec-ffb9f832340e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-aad5638f-3b4c-43c9-a453-2cd987bcc593" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:17:25 compute-0 nova_compute[192698]: 2025-10-01 14:17:25.210 2 DEBUG nova.network.neutron [req-049e1d27-59d7-4424-9a95-573ff0a5ee28 req-0d83baf6-970d-4c60-a4ec-ffb9f832340e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Refreshing network info cache for port 23321592-5912-475c-80cc-9fe5944d128d _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:17:25 compute-0 nova_compute[192698]: 2025-10-01 14:17:25.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:25 compute-0 nova_compute[192698]: 2025-10-01 14:17:25.656 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "refresh_cache-aad5638f-3b4c-43c9-a453-2cd987bcc593" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:17:25 compute-0 nova_compute[192698]: 2025-10-01 14:17:25.717 2 WARNING neutronclient.v2_0.client [req-049e1d27-59d7-4424-9a95-573ff0a5ee28 req-0d83baf6-970d-4c60-a4ec-ffb9f832340e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:17:25 compute-0 nova_compute[192698]: 2025-10-01 14:17:25.933 2 DEBUG nova.network.neutron [req-049e1d27-59d7-4424-9a95-573ff0a5ee28 req-0d83baf6-970d-4c60-a4ec-ffb9f832340e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:17:26 compute-0 nova_compute[192698]: 2025-10-01 14:17:26.084 2 DEBUG nova.network.neutron [req-049e1d27-59d7-4424-9a95-573ff0a5ee28 req-0d83baf6-970d-4c60-a4ec-ffb9f832340e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:17:26 compute-0 nova_compute[192698]: 2025-10-01 14:17:26.591 2 DEBUG oslo_concurrency.lockutils [req-049e1d27-59d7-4424-9a95-573ff0a5ee28 req-0d83baf6-970d-4c60-a4ec-ffb9f832340e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-aad5638f-3b4c-43c9-a453-2cd987bcc593" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:17:26 compute-0 nova_compute[192698]: 2025-10-01 14:17:26.592 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquired lock "refresh_cache-aad5638f-3b4c-43c9-a453-2cd987bcc593" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:17:26 compute-0 nova_compute[192698]: 2025-10-01 14:17:26.592 2 DEBUG nova.network.neutron [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:17:26 compute-0 nova_compute[192698]: 2025-10-01 14:17:26.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:17:26 compute-0 nova_compute[192698]: 2025-10-01 14:17:26.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:17:27 compute-0 nova_compute[192698]: 2025-10-01 14:17:27.248 2 DEBUG nova.network.neutron [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:17:27 compute-0 nova_compute[192698]: 2025-10-01 14:17:27.461 2 WARNING neutronclient.v2_0.client [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:17:27 compute-0 nova_compute[192698]: 2025-10-01 14:17:27.593 2 DEBUG nova.network.neutron [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Updating instance_info_cache with network_info: [{"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.101 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Releasing lock "refresh_cache-aad5638f-3b4c-43c9-a453-2cd987bcc593" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.102 2 DEBUG nova.compute.manager [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Instance network_info: |[{"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.105 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Start _get_guest_xml network_info=[{"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.111 2 WARNING nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.113 2 DEBUG nova.virt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-624439701', uuid='aad5638f-3b4c-43c9-a453-2cd987bcc593'), owner=OwnerMeta(userid='f8897741e6ca4770b56d28d05fa3fc42', username='tempest-TestExecuteStrategies-30131345-project-admin', projectid='d43115e3729442e1b68b749acc0dabc8', projectname='tempest-TestExecuteStrategies-30131345'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759328248.1135786) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.118 2 DEBUG nova.virt.libvirt.host [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.119 2 DEBUG nova.virt.libvirt.host [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.123 2 DEBUG nova.virt.libvirt.host [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.124 2 DEBUG nova.virt.libvirt.host [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.125 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.125 2 DEBUG nova.virt.hardware [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.126 2 DEBUG nova.virt.hardware [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.126 2 DEBUG nova.virt.hardware [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.127 2 DEBUG nova.virt.hardware [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.127 2 DEBUG nova.virt.hardware [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.128 2 DEBUG nova.virt.hardware [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.128 2 DEBUG nova.virt.hardware [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.129 2 DEBUG nova.virt.hardware [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.129 2 DEBUG nova.virt.hardware [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.130 2 DEBUG nova.virt.hardware [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.130 2 DEBUG nova.virt.hardware [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.137 2 DEBUG nova.virt.libvirt.vif [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:17:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-624439701',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-624439701',id=17,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-l3gig426',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},ta
gs=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:17:23Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=aad5638f-3b4c-43c9-a453-2cd987bcc593,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.138 2 DEBUG nova.network.os_vif_util [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.139 2 DEBUG nova.network.os_vif_util [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:24:e1,bridge_name='br-int',has_traffic_filtering=True,id=23321592-5912-475c-80cc-9fe5944d128d,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23321592-59') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.141 2 DEBUG nova.objects.instance [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lazy-loading 'pci_devices' on Instance uuid aad5638f-3b4c-43c9-a453-2cd987bcc593 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.650 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:17:28 compute-0 nova_compute[192698]:   <uuid>aad5638f-3b4c-43c9-a453-2cd987bcc593</uuid>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   <name>instance-00000011</name>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteStrategies-server-624439701</nova:name>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:17:28</nova:creationTime>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:17:28 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:17:28 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:user uuid="f8897741e6ca4770b56d28d05fa3fc42">tempest-TestExecuteStrategies-30131345-project-admin</nova:user>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:project uuid="d43115e3729442e1b68b749acc0dabc8">tempest-TestExecuteStrategies-30131345</nova:project>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         <nova:port uuid="23321592-5912-475c-80cc-9fe5944d128d">
Oct 01 14:17:28 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <system>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <entry name="serial">aad5638f-3b4c-43c9-a453-2cd987bcc593</entry>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <entry name="uuid">aad5638f-3b4c-43c9-a453-2cd987bcc593</entry>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     </system>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   <os>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   </os>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   <features>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   </features>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk.config"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:63:24:e1"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <target dev="tap23321592-59"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/console.log" append="off"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <video>
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     </video>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:17:28 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:17:28 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:17:28 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:17:28 compute-0 nova_compute[192698]: </domain>
Oct 01 14:17:28 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.652 2 DEBUG nova.compute.manager [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Preparing to wait for external event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.653 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.653 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.654 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.654 2 DEBUG nova.virt.libvirt.vif [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:17:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-624439701',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-624439701',id=17,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-l3gig426',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-
admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:17:23Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=aad5638f-3b4c-43c9-a453-2cd987bcc593,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.655 2 DEBUG nova.network.os_vif_util [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.655 2 DEBUG nova.network.os_vif_util [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:24:e1,bridge_name='br-int',has_traffic_filtering=True,id=23321592-5912-475c-80cc-9fe5944d128d,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23321592-59') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.656 2 DEBUG os_vif [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:24:e1,bridge_name='br-int',has_traffic_filtering=True,id=23321592-5912-475c-80cc-9fe5944d128d,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23321592-59') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.656 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.657 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.657 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'e6b1913a-cb1d-5095-8ba1-9795b03f4be3', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.664 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap23321592-59, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.664 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap23321592-59, col_values=(('qos', UUID('c5a21c36-7c10-4df0-b8fa-cc37e99531cf')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.664 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap23321592-59, col_values=(('external_ids', {'iface-id': '23321592-5912-475c-80cc-9fe5944d128d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:63:24:e1', 'vm-uuid': 'aad5638f-3b4c-43c9-a453-2cd987bcc593'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:17:28 compute-0 NetworkManager[51741]: <info>  [1759328248.6683] manager: (tap23321592-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.674 2 INFO os_vif [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:24:e1,bridge_name='br-int',has_traffic_filtering=True,id=23321592-5912-475c-80cc-9fe5944d128d,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23321592-59')
Oct 01 14:17:28 compute-0 nova_compute[192698]: 2025-10-01 14:17:28.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:29 compute-0 podman[203144]: time="2025-10-01T14:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:17:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:17:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3021 "" "Go-http-client/1.1"
Oct 01 14:17:30 compute-0 nova_compute[192698]: 2025-10-01 14:17:30.211 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:17:30 compute-0 nova_compute[192698]: 2025-10-01 14:17:30.212 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:17:30 compute-0 nova_compute[192698]: 2025-10-01 14:17:30.212 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] No VIF found with MAC fa:16:3e:63:24:e1, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:17:30 compute-0 nova_compute[192698]: 2025-10-01 14:17:30.213 2 INFO nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Using config drive
Oct 01 14:17:30 compute-0 nova_compute[192698]: 2025-10-01 14:17:30.724 2 WARNING neutronclient.v2_0.client [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:17:31 compute-0 nova_compute[192698]: 2025-10-01 14:17:31.334 2 INFO nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Creating config drive at /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk.config
Oct 01 14:17:31 compute-0 nova_compute[192698]: 2025-10-01 14:17:31.346 2 DEBUG oslo_concurrency.processutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp6y486v2f execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:17:31 compute-0 openstack_network_exporter[205307]: ERROR   14:17:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:17:31 compute-0 openstack_network_exporter[205307]: ERROR   14:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:17:31 compute-0 openstack_network_exporter[205307]: ERROR   14:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:17:31 compute-0 openstack_network_exporter[205307]: ERROR   14:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:17:31 compute-0 openstack_network_exporter[205307]: ERROR   14:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:17:31 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 01 14:17:31 compute-0 nova_compute[192698]: 2025-10-01 14:17:31.493 2 DEBUG oslo_concurrency.processutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp6y486v2f" returned: 0 in 0.147s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:17:31 compute-0 kernel: tap23321592-59: entered promiscuous mode
Oct 01 14:17:31 compute-0 NetworkManager[51741]: <info>  [1759328251.5820] manager: (tap23321592-59): new Tun device (/org/freedesktop/NetworkManager/Devices/55)
Oct 01 14:17:31 compute-0 ovn_controller[94909]: 2025-10-01T14:17:31Z|00135|binding|INFO|Claiming lport 23321592-5912-475c-80cc-9fe5944d128d for this chassis.
Oct 01 14:17:31 compute-0 ovn_controller[94909]: 2025-10-01T14:17:31Z|00136|binding|INFO|23321592-5912-475c-80cc-9fe5944d128d: Claiming fa:16:3e:63:24:e1 10.100.0.6
Oct 01 14:17:31 compute-0 systemd-udevd[221504]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:17:31 compute-0 nova_compute[192698]: 2025-10-01 14:17:31.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:31 compute-0 nova_compute[192698]: 2025-10-01 14:17:31.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.657 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:24:e1 10.100.0.6'], port_security=['fa:16:3e:63:24:e1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'aad5638f-3b4c-43c9-a453-2cd987bcc593', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=23321592-5912-475c-80cc-9fe5944d128d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.658 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 23321592-5912-475c-80cc-9fe5944d128d in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 bound to our chassis
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.660 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:17:31 compute-0 NetworkManager[51741]: <info>  [1759328251.6649] device (tap23321592-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:17:31 compute-0 NetworkManager[51741]: <info>  [1759328251.6666] device (tap23321592-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.680 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[68bdf71a-bad2-4282-a8ae-4b3da0710472]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.681 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap031a8987-81 in ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.684 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap031a8987-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.684 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[75a0638b-db6e-4f1a-800b-d1f52e107bd9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.686 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[6f7e64dc-e1a2-4b7a-ad49-4124f8b37136]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:31 compute-0 systemd-machined[152704]: New machine qemu-12-instance-00000011.
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.707 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[3a3d530d-8d16-4a90-8f2e-cb660c6402e9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:31 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000011.
Oct 01 14:17:31 compute-0 ovn_controller[94909]: 2025-10-01T14:17:31Z|00137|binding|INFO|Setting lport 23321592-5912-475c-80cc-9fe5944d128d ovn-installed in OVS
Oct 01 14:17:31 compute-0 ovn_controller[94909]: 2025-10-01T14:17:31Z|00138|binding|INFO|Setting lport 23321592-5912-475c-80cc-9fe5944d128d up in Southbound
Oct 01 14:17:31 compute-0 nova_compute[192698]: 2025-10-01 14:17:31.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.724 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[958c7619-85b1-4bf6-aa24-7db8b8111b80]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.766 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[221ba3d3-d269-49d1-9942-df867aecfaae]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.771 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[8704208d-a220-4fb1-883b-be7f0c6b249f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:31 compute-0 NetworkManager[51741]: <info>  [1759328251.7735] manager: (tap031a8987-80): new Veth device (/org/freedesktop/NetworkManager/Devices/56)
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.813 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[c3894043-328d-4582-80b6-32a7e8902ebe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.816 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[ac3776ed-3725-415f-8dea-7a1be77645d0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:31 compute-0 NetworkManager[51741]: <info>  [1759328251.8461] device (tap031a8987-80): carrier: link connected
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.854 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[68399867-e75f-4a6c-89df-76478145f69c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.875 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[0deb515c-1cad-4650-a28a-f869d356450a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461237, 'reachable_time': 33178, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221540, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.900 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[07b6e970-b993-49d2-80e8-3ef1e31904f5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:6c81'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 461237, 'tstamp': 461237}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221541, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.919 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[016cd732-e13f-4f26-996d-36906af48457]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461237, 'reachable_time': 33178, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221542, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:31.966 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[036981bd-826c-4e67-bc8e-16b23f247abd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.061 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d942608e-5ae8-451f-9a36-5a3540d62e16]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.063 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.063 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.064 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:17:32 compute-0 nova_compute[192698]: 2025-10-01 14:17:32.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:32 compute-0 NetworkManager[51741]: <info>  [1759328252.0678] manager: (tap031a8987-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Oct 01 14:17:32 compute-0 kernel: tap031a8987-80: entered promiscuous mode
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.070 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:17:32 compute-0 nova_compute[192698]: 2025-10-01 14:17:32.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:32 compute-0 ovn_controller[94909]: 2025-10-01T14:17:32Z|00139|binding|INFO|Releasing lport 6dd814dc-cba2-4392-85ef-eadb8c4615f7 from this chassis (sb_readonly=0)
Oct 01 14:17:32 compute-0 nova_compute[192698]: 2025-10-01 14:17:32.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.097 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[46289b31-d780-4230-88a5-0048218ba0cd]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.098 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.099 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.099 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 031a8987-8430-4fb6-a464-01e4dca2fae7 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.099 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.100 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[5e3b5506-a684-4aaf-9880-62fd3e43205b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.100 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.101 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[74d71ef5-41ff-44a2-81b7-89251ac65a21]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.102 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:17:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:17:32.103 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'env', 'PROCESS_TAG=haproxy-031a8987-8430-4fb6-a464-01e4dca2fae7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/031a8987-8430-4fb6-a464-01e4dca2fae7.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:17:32 compute-0 nova_compute[192698]: 2025-10-01 14:17:32.372 2 DEBUG nova.compute.manager [req-1da62d09-3e1e-4570-bc7c-fd03337a41a1 req-b83595d3-b0c3-41a6-b06a-5cb1bafa77e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:17:32 compute-0 nova_compute[192698]: 2025-10-01 14:17:32.374 2 DEBUG oslo_concurrency.lockutils [req-1da62d09-3e1e-4570-bc7c-fd03337a41a1 req-b83595d3-b0c3-41a6-b06a-5cb1bafa77e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:17:32 compute-0 nova_compute[192698]: 2025-10-01 14:17:32.375 2 DEBUG oslo_concurrency.lockutils [req-1da62d09-3e1e-4570-bc7c-fd03337a41a1 req-b83595d3-b0c3-41a6-b06a-5cb1bafa77e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:17:32 compute-0 nova_compute[192698]: 2025-10-01 14:17:32.376 2 DEBUG oslo_concurrency.lockutils [req-1da62d09-3e1e-4570-bc7c-fd03337a41a1 req-b83595d3-b0c3-41a6-b06a-5cb1bafa77e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:17:32 compute-0 nova_compute[192698]: 2025-10-01 14:17:32.377 2 DEBUG nova.compute.manager [req-1da62d09-3e1e-4570-bc7c-fd03337a41a1 req-b83595d3-b0c3-41a6-b06a-5cb1bafa77e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Processing event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:17:32 compute-0 podman[221581]: 2025-10-01 14:17:32.62764047 +0000 UTC m=+0.082444074 container create c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 01 14:17:32 compute-0 nova_compute[192698]: 2025-10-01 14:17:32.646 2 DEBUG nova.compute.manager [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:17:32 compute-0 nova_compute[192698]: 2025-10-01 14:17:32.651 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 01 14:17:32 compute-0 nova_compute[192698]: 2025-10-01 14:17:32.656 2 INFO nova.virt.libvirt.driver [-] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Instance spawned successfully.
Oct 01 14:17:32 compute-0 nova_compute[192698]: 2025-10-01 14:17:32.656 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 01 14:17:32 compute-0 podman[221581]: 2025-10-01 14:17:32.592247961 +0000 UTC m=+0.047051595 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:17:32 compute-0 systemd[1]: Started libpod-conmon-c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0.scope.
Oct 01 14:17:32 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:17:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac1ed7bcb47da107e4572e7d3f71d82895e1c01344592b2461070e85d70ef618/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:17:32 compute-0 podman[221581]: 2025-10-01 14:17:32.749182892 +0000 UTC m=+0.203986566 container init c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 01 14:17:32 compute-0 podman[221581]: 2025-10-01 14:17:32.757332733 +0000 UTC m=+0.212136307 container start c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 01 14:17:32 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[221596]: [NOTICE]   (221600) : New worker (221602) forked
Oct 01 14:17:32 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[221596]: [NOTICE]   (221600) : Loading success.
Oct 01 14:17:33 compute-0 nova_compute[192698]: 2025-10-01 14:17:33.175 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:17:33 compute-0 nova_compute[192698]: 2025-10-01 14:17:33.176 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:17:33 compute-0 nova_compute[192698]: 2025-10-01 14:17:33.177 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:17:33 compute-0 nova_compute[192698]: 2025-10-01 14:17:33.178 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:17:33 compute-0 nova_compute[192698]: 2025-10-01 14:17:33.179 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:17:33 compute-0 nova_compute[192698]: 2025-10-01 14:17:33.180 2 DEBUG nova.virt.libvirt.driver [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:17:33 compute-0 nova_compute[192698]: 2025-10-01 14:17:33.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:33 compute-0 nova_compute[192698]: 2025-10-01 14:17:33.690 2 INFO nova.compute.manager [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Took 9.19 seconds to spawn the instance on the hypervisor.
Oct 01 14:17:33 compute-0 nova_compute[192698]: 2025-10-01 14:17:33.691 2 DEBUG nova.compute.manager [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:17:33 compute-0 nova_compute[192698]: 2025-10-01 14:17:33.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:34 compute-0 nova_compute[192698]: 2025-10-01 14:17:34.239 2 INFO nova.compute.manager [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Took 14.44 seconds to build instance.
Oct 01 14:17:34 compute-0 nova_compute[192698]: 2025-10-01 14:17:34.439 2 DEBUG nova.compute.manager [req-93491767-4c98-40fb-9489-230fbd4fe94d req-6bf0afd7-a5ab-4e90-8f78-17c3453443e6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:17:34 compute-0 nova_compute[192698]: 2025-10-01 14:17:34.440 2 DEBUG oslo_concurrency.lockutils [req-93491767-4c98-40fb-9489-230fbd4fe94d req-6bf0afd7-a5ab-4e90-8f78-17c3453443e6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:17:34 compute-0 nova_compute[192698]: 2025-10-01 14:17:34.441 2 DEBUG oslo_concurrency.lockutils [req-93491767-4c98-40fb-9489-230fbd4fe94d req-6bf0afd7-a5ab-4e90-8f78-17c3453443e6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:17:34 compute-0 nova_compute[192698]: 2025-10-01 14:17:34.442 2 DEBUG oslo_concurrency.lockutils [req-93491767-4c98-40fb-9489-230fbd4fe94d req-6bf0afd7-a5ab-4e90-8f78-17c3453443e6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:17:34 compute-0 nova_compute[192698]: 2025-10-01 14:17:34.442 2 DEBUG nova.compute.manager [req-93491767-4c98-40fb-9489-230fbd4fe94d req-6bf0afd7-a5ab-4e90-8f78-17c3453443e6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] No waiting events found dispatching network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:17:34 compute-0 nova_compute[192698]: 2025-10-01 14:17:34.443 2 WARNING nova.compute.manager [req-93491767-4c98-40fb-9489-230fbd4fe94d req-6bf0afd7-a5ab-4e90-8f78-17c3453443e6 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received unexpected event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d for instance with vm_state active and task_state None.
Oct 01 14:17:34 compute-0 nova_compute[192698]: 2025-10-01 14:17:34.745 2 DEBUG oslo_concurrency.lockutils [None req-8706ded6-21ba-4927-b05f-09be6ab5ee38 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.966s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:17:36 compute-0 podman[221611]: 2025-10-01 14:17:36.150967214 +0000 UTC m=+0.066084951 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 01 14:17:36 compute-0 podman[221612]: 2025-10-01 14:17:36.27115396 +0000 UTC m=+0.170620613 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 01 14:17:38 compute-0 nova_compute[192698]: 2025-10-01 14:17:38.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:38 compute-0 nova_compute[192698]: 2025-10-01 14:17:38.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:43 compute-0 nova_compute[192698]: 2025-10-01 14:17:43.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:43 compute-0 nova_compute[192698]: 2025-10-01 14:17:43.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:44 compute-0 podman[221673]: 2025-10-01 14:17:44.210549529 +0000 UTC m=+0.108777398 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, distribution-scope=public, 
io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 01 14:17:45 compute-0 ovn_controller[94909]: 2025-10-01T14:17:45Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:63:24:e1 10.100.0.6
Oct 01 14:17:45 compute-0 ovn_controller[94909]: 2025-10-01T14:17:45Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:63:24:e1 10.100.0.6
Oct 01 14:17:48 compute-0 nova_compute[192698]: 2025-10-01 14:17:48.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:48 compute-0 nova_compute[192698]: 2025-10-01 14:17:48.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:49 compute-0 podman[221695]: 2025-10-01 14:17:49.198776172 +0000 UTC m=+0.104357658 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 01 14:17:49 compute-0 podman[221696]: 2025-10-01 14:17:49.21789189 +0000 UTC m=+0.113864715 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 01 14:17:53 compute-0 nova_compute[192698]: 2025-10-01 14:17:53.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:53 compute-0 nova_compute[192698]: 2025-10-01 14:17:53.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:55 compute-0 podman[221736]: 2025-10-01 14:17:55.202851411 +0000 UTC m=+0.108238933 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:17:55 compute-0 nova_compute[192698]: 2025-10-01 14:17:55.738 2 DEBUG nova.compute.manager [None req-bdfc7d08-5c90-4ff3-afb4-3c2fc58b3a29 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Oct 01 14:17:55 compute-0 nova_compute[192698]: 2025-10-01 14:17:55.787 2 DEBUG nova.compute.provider_tree [None req-bdfc7d08-5c90-4ff3-afb4-3c2fc58b3a29 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Updating resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 generation from 16 to 20 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 01 14:17:58 compute-0 nova_compute[192698]: 2025-10-01 14:17:58.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:58 compute-0 nova_compute[192698]: 2025-10-01 14:17:58.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:17:59 compute-0 podman[203144]: time="2025-10-01T14:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:17:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:17:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3484 "" "Go-http-client/1.1"
Oct 01 14:18:01 compute-0 openstack_network_exporter[205307]: ERROR   14:18:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:18:01 compute-0 openstack_network_exporter[205307]: ERROR   14:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:18:01 compute-0 openstack_network_exporter[205307]: ERROR   14:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:18:01 compute-0 openstack_network_exporter[205307]: ERROR   14:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:18:01 compute-0 openstack_network_exporter[205307]: ERROR   14:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:18:02 compute-0 ovn_controller[94909]: 2025-10-01T14:18:02Z|00140|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 01 14:18:03 compute-0 nova_compute[192698]: 2025-10-01 14:18:03.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:03 compute-0 nova_compute[192698]: 2025-10-01 14:18:03.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:07 compute-0 nova_compute[192698]: 2025-10-01 14:18:07.036 2 DEBUG nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Check if temp file /var/lib/nova/instances/tmp1ffvczmt exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Oct 01 14:18:07 compute-0 nova_compute[192698]: 2025-10-01 14:18:07.042 2 DEBUG nova.compute.manager [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1ffvczmt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='aad5638f-3b4c-43c9-a453-2cd987bcc593',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Oct 01 14:18:07 compute-0 podman[221760]: 2025-10-01 14:18:07.176687418 +0000 UTC m=+0.085027165 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 01 14:18:07 compute-0 podman[221761]: 2025-10-01 14:18:07.239956831 +0000 UTC m=+0.143340583 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:18:08 compute-0 nova_compute[192698]: 2025-10-01 14:18:08.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:08 compute-0 nova_compute[192698]: 2025-10-01 14:18:08.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:12 compute-0 nova_compute[192698]: 2025-10-01 14:18:12.530 2 DEBUG oslo_concurrency.processutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:18:12 compute-0 nova_compute[192698]: 2025-10-01 14:18:12.587 2 DEBUG oslo_concurrency.processutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:18:12 compute-0 nova_compute[192698]: 2025-10-01 14:18:12.588 2 DEBUG oslo_concurrency.processutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:18:12 compute-0 nova_compute[192698]: 2025-10-01 14:18:12.649 2 DEBUG oslo_concurrency.processutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:18:12 compute-0 nova_compute[192698]: 2025-10-01 14:18:12.650 2 DEBUG nova.compute.manager [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Preparing to wait for external event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:18:12 compute-0 nova_compute[192698]: 2025-10-01 14:18:12.651 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:12 compute-0 nova_compute[192698]: 2025-10-01 14:18:12.651 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:12 compute-0 nova_compute[192698]: 2025-10-01 14:18:12.651 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:13 compute-0 nova_compute[192698]: 2025-10-01 14:18:13.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:13 compute-0 nova_compute[192698]: 2025-10-01 14:18:13.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:14.265 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:14.266 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:14.267 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:14 compute-0 nova_compute[192698]: 2025-10-01 14:18:14.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:18:15 compute-0 podman[221812]: 2025-10-01 14:18:15.173316356 +0000 UTC m=+0.086495244 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Oct 01 14:18:15 compute-0 nova_compute[192698]: 2025-10-01 14:18:15.446 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:15 compute-0 nova_compute[192698]: 2025-10-01 14:18:15.447 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:15 compute-0 nova_compute[192698]: 2025-10-01 14:18:15.448 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:15 compute-0 nova_compute[192698]: 2025-10-01 14:18:15.448 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:18:16 compute-0 nova_compute[192698]: 2025-10-01 14:18:16.497 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:18:16 compute-0 nova_compute[192698]: 2025-10-01 14:18:16.578 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:18:16 compute-0 nova_compute[192698]: 2025-10-01 14:18:16.579 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:18:16 compute-0 nova_compute[192698]: 2025-10-01 14:18:16.648 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:18:16 compute-0 nova_compute[192698]: 2025-10-01 14:18:16.834 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:18:16 compute-0 nova_compute[192698]: 2025-10-01 14:18:16.835 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:18:16 compute-0 nova_compute[192698]: 2025-10-01 14:18:16.877 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:18:16 compute-0 nova_compute[192698]: 2025-10-01 14:18:16.877 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5681MB free_disk=73.27425003051758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:18:16 compute-0 nova_compute[192698]: 2025-10-01 14:18:16.878 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:16 compute-0 nova_compute[192698]: 2025-10-01 14:18:16.878 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:17 compute-0 nova_compute[192698]: 2025-10-01 14:18:17.897 2 INFO nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Updating resource usage from migration 2925e0b3-6229-4941-a681-1afe3691fe7f
Oct 01 14:18:17 compute-0 nova_compute[192698]: 2025-10-01 14:18:17.930 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Migration 2925e0b3-6229-4941-a681-1afe3691fe7f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Oct 01 14:18:17 compute-0 nova_compute[192698]: 2025-10-01 14:18:17.931 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:18:17 compute-0 nova_compute[192698]: 2025-10-01 14:18:17.931 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:18:16 up  1:17,  0 user,  load average: 0.33, 0.25, 0.35\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_d43115e3729442e1b68b749acc0dabc8': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:18:17 compute-0 nova_compute[192698]: 2025-10-01 14:18:17.949 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing inventories for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 01 14:18:17 compute-0 nova_compute[192698]: 2025-10-01 14:18:17.964 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating ProviderTree inventory for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 01 14:18:17 compute-0 nova_compute[192698]: 2025-10-01 14:18:17.964 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 14:18:17 compute-0 nova_compute[192698]: 2025-10-01 14:18:17.981 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing aggregate associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 01 14:18:18 compute-0 nova_compute[192698]: 2025-10-01 14:18:18.003 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing trait associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, traits: COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STATUS_DISABLED,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 01 14:18:18 compute-0 nova_compute[192698]: 2025-10-01 14:18:18.043 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:18:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:18.428 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:18:18 compute-0 nova_compute[192698]: 2025-10-01 14:18:18.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:18.429 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:18:18 compute-0 nova_compute[192698]: 2025-10-01 14:18:18.453 2 DEBUG nova.compute.manager [req-31167898-fea8-4622-86db-84b6aaf3b7cb req-cfd1920c-6502-4e28-8fab-347cb2d26fa7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-unplugged-23321592-5912-475c-80cc-9fe5944d128d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:18:18 compute-0 nova_compute[192698]: 2025-10-01 14:18:18.454 2 DEBUG oslo_concurrency.lockutils [req-31167898-fea8-4622-86db-84b6aaf3b7cb req-cfd1920c-6502-4e28-8fab-347cb2d26fa7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:18 compute-0 nova_compute[192698]: 2025-10-01 14:18:18.454 2 DEBUG oslo_concurrency.lockutils [req-31167898-fea8-4622-86db-84b6aaf3b7cb req-cfd1920c-6502-4e28-8fab-347cb2d26fa7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:18 compute-0 nova_compute[192698]: 2025-10-01 14:18:18.454 2 DEBUG oslo_concurrency.lockutils [req-31167898-fea8-4622-86db-84b6aaf3b7cb req-cfd1920c-6502-4e28-8fab-347cb2d26fa7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:18 compute-0 nova_compute[192698]: 2025-10-01 14:18:18.454 2 DEBUG nova.compute.manager [req-31167898-fea8-4622-86db-84b6aaf3b7cb req-cfd1920c-6502-4e28-8fab-347cb2d26fa7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] No event matching network-vif-unplugged-23321592-5912-475c-80cc-9fe5944d128d in dict_keys([('network-vif-plugged', '23321592-5912-475c-80cc-9fe5944d128d')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Oct 01 14:18:18 compute-0 nova_compute[192698]: 2025-10-01 14:18:18.454 2 DEBUG nova.compute.manager [req-31167898-fea8-4622-86db-84b6aaf3b7cb req-cfd1920c-6502-4e28-8fab-347cb2d26fa7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-unplugged-23321592-5912-475c-80cc-9fe5944d128d for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:18:18 compute-0 nova_compute[192698]: 2025-10-01 14:18:18.551 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:18:18 compute-0 nova_compute[192698]: 2025-10-01 14:18:18.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:18 compute-0 nova_compute[192698]: 2025-10-01 14:18:18.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:19 compute-0 nova_compute[192698]: 2025-10-01 14:18:19.062 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:18:19 compute-0 nova_compute[192698]: 2025-10-01 14:18:19.062 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.184s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:19 compute-0 nova_compute[192698]: 2025-10-01 14:18:19.063 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:18:19 compute-0 nova_compute[192698]: 2025-10-01 14:18:19.063 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 01 14:18:19 compute-0 nova_compute[192698]: 2025-10-01 14:18:19.569 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:18:19 compute-0 nova_compute[192698]: 2025-10-01 14:18:19.671 2 INFO nova.compute.manager [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Took 7.02 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Oct 01 14:18:20 compute-0 podman[221844]: 2025-10-01 14:18:20.155420174 +0000 UTC m=+0.068681041 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 01 14:18:20 compute-0 podman[221845]: 2025-10-01 14:18:20.159124535 +0000 UTC m=+0.066439001 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:18:20 compute-0 nova_compute[192698]: 2025-10-01 14:18:20.515 2 DEBUG nova.compute.manager [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:18:20 compute-0 nova_compute[192698]: 2025-10-01 14:18:20.516 2 DEBUG oslo_concurrency.lockutils [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:20 compute-0 nova_compute[192698]: 2025-10-01 14:18:20.516 2 DEBUG oslo_concurrency.lockutils [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:20 compute-0 nova_compute[192698]: 2025-10-01 14:18:20.517 2 DEBUG oslo_concurrency.lockutils [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:20 compute-0 nova_compute[192698]: 2025-10-01 14:18:20.517 2 DEBUG nova.compute.manager [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Processing event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:18:20 compute-0 nova_compute[192698]: 2025-10-01 14:18:20.517 2 DEBUG nova.compute.manager [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-changed-23321592-5912-475c-80cc-9fe5944d128d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:18:20 compute-0 nova_compute[192698]: 2025-10-01 14:18:20.518 2 DEBUG nova.compute.manager [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Refreshing instance network info cache due to event network-changed-23321592-5912-475c-80cc-9fe5944d128d. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:18:20 compute-0 nova_compute[192698]: 2025-10-01 14:18:20.518 2 DEBUG oslo_concurrency.lockutils [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-aad5638f-3b4c-43c9-a453-2cd987bcc593" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:18:20 compute-0 nova_compute[192698]: 2025-10-01 14:18:20.518 2 DEBUG oslo_concurrency.lockutils [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-aad5638f-3b4c-43c9-a453-2cd987bcc593" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:18:20 compute-0 nova_compute[192698]: 2025-10-01 14:18:20.519 2 DEBUG nova.network.neutron [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Refreshing network info cache for port 23321592-5912-475c-80cc-9fe5944d128d _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:18:20 compute-0 nova_compute[192698]: 2025-10-01 14:18:20.521 2 DEBUG nova.compute.manager [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:18:21 compute-0 nova_compute[192698]: 2025-10-01 14:18:21.027 2 WARNING neutronclient.v2_0.client [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:18:21 compute-0 nova_compute[192698]: 2025-10-01 14:18:21.030 2 DEBUG nova.compute.manager [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1ffvczmt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='aad5638f-3b4c-43c9-a453-2cd987bcc593',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(2925e0b3-6229-4941-a681-1afe3691fe7f),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Oct 01 14:18:21 compute-0 nova_compute[192698]: 2025-10-01 14:18:21.547 2 DEBUG nova.objects.instance [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid aad5638f-3b4c-43c9-a453-2cd987bcc593 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:18:21 compute-0 nova_compute[192698]: 2025-10-01 14:18:21.549 2 DEBUG nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Oct 01 14:18:21 compute-0 nova_compute[192698]: 2025-10-01 14:18:21.551 2 DEBUG nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Oct 01 14:18:21 compute-0 nova_compute[192698]: 2025-10-01 14:18:21.551 2 DEBUG nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Oct 01 14:18:21 compute-0 nova_compute[192698]: 2025-10-01 14:18:21.612 2 WARNING neutronclient.v2_0.client [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:18:21 compute-0 nova_compute[192698]: 2025-10-01 14:18:21.766 2 DEBUG nova.network.neutron [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Updated VIF entry in instance network info cache for port 23321592-5912-475c-80cc-9fe5944d128d. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Oct 01 14:18:21 compute-0 nova_compute[192698]: 2025-10-01 14:18:21.767 2 DEBUG nova.network.neutron [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Updating instance_info_cache with network_info: [{"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:18:22 compute-0 nova_compute[192698]: 2025-10-01 14:18:22.054 2 DEBUG nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Oct 01 14:18:22 compute-0 nova_compute[192698]: 2025-10-01 14:18:22.054 2 DEBUG nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Oct 01 14:18:22 compute-0 nova_compute[192698]: 2025-10-01 14:18:22.071 2 DEBUG nova.virt.libvirt.vif [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:17:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-624439701',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-624439701',id=17,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:17:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-l3gig426',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:17:33Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=aad5638f-3b4c-43c9-a453-2cd987bcc593,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:18:22 compute-0 nova_compute[192698]: 2025-10-01 14:18:22.071 2 DEBUG nova.network.os_vif_util [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:18:22 compute-0 nova_compute[192698]: 2025-10-01 14:18:22.072 2 DEBUG nova.network.os_vif_util [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:24:e1,bridge_name='br-int',has_traffic_filtering=True,id=23321592-5912-475c-80cc-9fe5944d128d,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23321592-59') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:18:22 compute-0 nova_compute[192698]: 2025-10-01 14:18:22.073 2 DEBUG nova.virt.libvirt.migration [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Updating guest XML with vif config: <interface type="ethernet">
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <mac address="fa:16:3e:63:24:e1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <model type="virtio"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <mtu size="1442"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <target dev="tap23321592-59"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]: </interface>
Oct 01 14:18:22 compute-0 nova_compute[192698]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Oct 01 14:18:22 compute-0 nova_compute[192698]: 2025-10-01 14:18:22.074 2 DEBUG nova.virt.libvirt.migration [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <name>instance-00000011</name>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <uuid>aad5638f-3b4c-43c9-a453-2cd987bcc593</uuid>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteStrategies-server-624439701</nova:name>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:17:28</nova:creationTime>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:18:22 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:18:22 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:user uuid="f8897741e6ca4770b56d28d05fa3fc42">tempest-TestExecuteStrategies-30131345-project-admin</nova:user>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:project uuid="d43115e3729442e1b68b749acc0dabc8">tempest-TestExecuteStrategies-30131345</nova:project>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:port uuid="23321592-5912-475c-80cc-9fe5944d128d">
Oct 01 14:18:22 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <memory unit="KiB">131072</memory>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <vcpu placement="static">1</vcpu>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <resource>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <partition>/machine</partition>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </resource>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <system>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="serial">aad5638f-3b4c-43c9-a453-2cd987bcc593</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="uuid">aad5638f-3b4c-43c9-a453-2cd987bcc593</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </system>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <os>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </os>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <features>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <vmcoreinfo state="on"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </features>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <cpu mode="host-model" check="partial">
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <on_poweroff>destroy</on_poweroff>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <on_reboot>restart</on_reboot>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <on_crash>destroy</on_crash>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk.config"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <readonly/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="1" port="0x10"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="2" port="0x11"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="3" port="0x12"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="4" port="0x13"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="5" port="0x14"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="6" port="0x15"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="7" port="0x16"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="8" port="0x17"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="9" port="0x18"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="10" port="0x19"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="11" port="0x1a"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="12" port="0x1b"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="13" port="0x1c"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="14" port="0x1d"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="15" port="0x1e"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="16" port="0x1f"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="17" port="0x20"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="18" port="0x21"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="19" port="0x22"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="20" port="0x23"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="21" port="0x24"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="22" port="0x25"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="23" port="0x26"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="24" port="0x27"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="25" port="0x28"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-pci-bridge"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="sata" index="0">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:63:24:e1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target dev="tap23321592-59"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/console.log" append="off"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target type="isa-serial" port="0">
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <model name="isa-serial"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </target>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <console type="pty">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/console.log" append="off"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target type="serial" port="0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </console>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="usb" bus="0" port="1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </input>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <input type="mouse" bus="ps2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <listen type="address" address="::"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </graphics>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <video>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model type="virtio" heads="1" primary="yes"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </video>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]: </domain>
Oct 01 14:18:22 compute-0 nova_compute[192698]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Oct 01 14:18:22 compute-0 nova_compute[192698]: 2025-10-01 14:18:22.076 2 DEBUG nova.virt.libvirt.migration [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <name>instance-00000011</name>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <uuid>aad5638f-3b4c-43c9-a453-2cd987bcc593</uuid>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteStrategies-server-624439701</nova:name>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:17:28</nova:creationTime>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:18:22 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:18:22 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:user uuid="f8897741e6ca4770b56d28d05fa3fc42">tempest-TestExecuteStrategies-30131345-project-admin</nova:user>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:project uuid="d43115e3729442e1b68b749acc0dabc8">tempest-TestExecuteStrategies-30131345</nova:project>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:port uuid="23321592-5912-475c-80cc-9fe5944d128d">
Oct 01 14:18:22 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <memory unit="KiB">131072</memory>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <vcpu placement="static">1</vcpu>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <resource>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <partition>/machine</partition>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </resource>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <system>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="serial">aad5638f-3b4c-43c9-a453-2cd987bcc593</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="uuid">aad5638f-3b4c-43c9-a453-2cd987bcc593</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </system>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <os>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </os>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <features>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <vmcoreinfo state="on"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </features>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <cpu mode="host-model" check="partial">
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <on_poweroff>destroy</on_poweroff>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <on_reboot>restart</on_reboot>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <on_crash>destroy</on_crash>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk.config"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <readonly/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="1" port="0x10"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="2" port="0x11"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="3" port="0x12"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="4" port="0x13"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="5" port="0x14"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="6" port="0x15"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="7" port="0x16"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="8" port="0x17"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="9" port="0x18"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="10" port="0x19"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="11" port="0x1a"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="12" port="0x1b"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="13" port="0x1c"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="14" port="0x1d"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="15" port="0x1e"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="16" port="0x1f"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="17" port="0x20"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="18" port="0x21"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="19" port="0x22"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="20" port="0x23"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="21" port="0x24"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="22" port="0x25"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="23" port="0x26"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="24" port="0x27"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="25" port="0x28"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-pci-bridge"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="sata" index="0">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:63:24:e1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target dev="tap23321592-59"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/console.log" append="off"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target type="isa-serial" port="0">
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <model name="isa-serial"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </target>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <console type="pty">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/console.log" append="off"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target type="serial" port="0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </console>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="usb" bus="0" port="1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </input>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <input type="mouse" bus="ps2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <listen type="address" address="::"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </graphics>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <video>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model type="virtio" heads="1" primary="yes"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </video>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]: </domain>
Oct 01 14:18:22 compute-0 nova_compute[192698]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Oct 01 14:18:22 compute-0 nova_compute[192698]: 2025-10-01 14:18:22.077 2 DEBUG nova.virt.libvirt.migration [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] _update_pci_xml output xml=<domain type="kvm">
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <name>instance-00000011</name>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <uuid>aad5638f-3b4c-43c9-a453-2cd987bcc593</uuid>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteStrategies-server-624439701</nova:name>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:17:28</nova:creationTime>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:18:22 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:18:22 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:user uuid="f8897741e6ca4770b56d28d05fa3fc42">tempest-TestExecuteStrategies-30131345-project-admin</nova:user>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:project uuid="d43115e3729442e1b68b749acc0dabc8">tempest-TestExecuteStrategies-30131345</nova:project>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <nova:port uuid="23321592-5912-475c-80cc-9fe5944d128d">
Oct 01 14:18:22 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <memory unit="KiB">131072</memory>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <vcpu placement="static">1</vcpu>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <resource>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <partition>/machine</partition>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </resource>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <system>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="serial">aad5638f-3b4c-43c9-a453-2cd987bcc593</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="uuid">aad5638f-3b4c-43c9-a453-2cd987bcc593</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </system>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <os>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </os>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <features>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <vmcoreinfo state="on"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </features>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <cpu mode="host-model" check="partial">
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <on_poweroff>destroy</on_poweroff>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <on_reboot>restart</on_reboot>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <on_crash>destroy</on_crash>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/disk.config"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <readonly/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="1" port="0x10"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="2" port="0x11"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="3" port="0x12"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="4" port="0x13"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="5" port="0x14"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="6" port="0x15"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="7" port="0x16"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="8" port="0x17"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="9" port="0x18"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="10" port="0x19"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="11" port="0x1a"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="12" port="0x1b"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="13" port="0x1c"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="14" port="0x1d"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="15" port="0x1e"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="16" port="0x1f"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="17" port="0x20"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="18" port="0x21"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="19" port="0x22"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="20" port="0x23"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="21" port="0x24"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="22" port="0x25"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="23" port="0x26"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="24" port="0x27"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target chassis="25" port="0x28"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model name="pcie-pci-bridge"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <controller type="sata" index="0">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:63:24:e1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target dev="tap23321592-59"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/console.log" append="off"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target type="isa-serial" port="0">
Oct 01 14:18:22 compute-0 nova_compute[192698]:         <model name="isa-serial"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       </target>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <console type="pty">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593/console.log" append="off"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <target type="serial" port="0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </console>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="usb" bus="0" port="1"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </input>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <input type="mouse" bus="ps2"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <listen type="address" address="::"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </graphics>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <video>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <model type="virtio" heads="1" primary="yes"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </video>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:18:22 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:18:22 compute-0 nova_compute[192698]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 01 14:18:22 compute-0 nova_compute[192698]: </domain>
Oct 01 14:18:22 compute-0 nova_compute[192698]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Oct 01 14:18:22 compute-0 nova_compute[192698]: 2025-10-01 14:18:22.077 2 DEBUG nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Oct 01 14:18:22 compute-0 nova_compute[192698]: 2025-10-01 14:18:22.279 2 DEBUG oslo_concurrency.lockutils [req-df6c960c-8208-44b6-842c-bc2713b4f86c req-0d4baeb0-25a5-4bc6-bc9d-91656cdd72c2 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-aad5638f-3b4c-43c9-a453-2cd987bcc593" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:18:22 compute-0 nova_compute[192698]: 2025-10-01 14:18:22.557 2 DEBUG nova.virt.libvirt.migration [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 01 14:18:22 compute-0 nova_compute[192698]: 2025-10-01 14:18:22.557 2 INFO nova.virt.libvirt.migration [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Increasing downtime to 50 ms after 0 sec elapsed time
Oct 01 14:18:23 compute-0 nova_compute[192698]: 2025-10-01 14:18:23.586 2 INFO nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Oct 01 14:18:23 compute-0 nova_compute[192698]: 2025-10-01 14:18:23.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:23 compute-0 nova_compute[192698]: 2025-10-01 14:18:23.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:24 compute-0 kernel: tap23321592-59 (unregistering): left promiscuous mode
Oct 01 14:18:24 compute-0 NetworkManager[51741]: <info>  [1759328304.0488] device (tap23321592-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:18:24 compute-0 ovn_controller[94909]: 2025-10-01T14:18:24Z|00141|binding|INFO|Releasing lport 23321592-5912-475c-80cc-9fe5944d128d from this chassis (sb_readonly=0)
Oct 01 14:18:24 compute-0 ovn_controller[94909]: 2025-10-01T14:18:24Z|00142|binding|INFO|Setting lport 23321592-5912-475c-80cc-9fe5944d128d down in Southbound
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:24 compute-0 ovn_controller[94909]: 2025-10-01T14:18:24Z|00143|binding|INFO|Removing iface tap23321592-59 ovn-installed in OVS
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.074 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.074 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.075 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.075 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.075 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.094 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:24:e1 10.100.0.6'], port_security=['fa:16:3e:63:24:e1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'd71f76a2-379d-402b-b590-797cbe777099'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'aad5638f-3b4c-43c9-a453-2cd987bcc593', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '10', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=23321592-5912-475c-80cc-9fe5944d128d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.095 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 23321592-5912-475c-80cc-9fe5944d128d in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.096 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 031a8987-8430-4fb6-a464-01e4dca2fae7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.097 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ed704f6a-6d4e-4870-83bf-25f74e8c1bca]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.098 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 namespace which is not needed anymore
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:24 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000011.scope: Deactivated successfully.
Oct 01 14:18:24 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000011.scope: Consumed 15.791s CPU time.
Oct 01 14:18:24 compute-0 systemd-machined[152704]: Machine qemu-12-instance-00000011 terminated.
Oct 01 14:18:24 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[221596]: [NOTICE]   (221600) : haproxy version is 3.0.5-8e879a5
Oct 01 14:18:24 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[221596]: [NOTICE]   (221600) : path to executable is /usr/sbin/haproxy
Oct 01 14:18:24 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[221596]: [WARNING]  (221600) : Exiting Master process...
Oct 01 14:18:24 compute-0 podman[221914]: 2025-10-01 14:18:24.247027401 +0000 UTC m=+0.044059344 container kill c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 14:18:24 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[221596]: [ALERT]    (221600) : Current worker (221602) exited with code 143 (Terminated)
Oct 01 14:18:24 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[221596]: [WARNING]  (221600) : All workers exited. Exiting... (0)
Oct 01 14:18:24 compute-0 systemd[1]: libpod-c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0.scope: Deactivated successfully.
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.255 2 DEBUG nova.compute.manager [req-b3dc5ec0-a194-491d-9165-ffdd5e198d31 req-e7e9ba47-2b0f-4527-bc42-2088b37dc0c5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-unplugged-23321592-5912-475c-80cc-9fe5944d128d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.255 2 DEBUG oslo_concurrency.lockutils [req-b3dc5ec0-a194-491d-9165-ffdd5e198d31 req-e7e9ba47-2b0f-4527-bc42-2088b37dc0c5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.256 2 DEBUG oslo_concurrency.lockutils [req-b3dc5ec0-a194-491d-9165-ffdd5e198d31 req-e7e9ba47-2b0f-4527-bc42-2088b37dc0c5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.256 2 DEBUG oslo_concurrency.lockutils [req-b3dc5ec0-a194-491d-9165-ffdd5e198d31 req-e7e9ba47-2b0f-4527-bc42-2088b37dc0c5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:24 compute-0 conmon[221596]: conmon c23feed40c642ca91a4f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0.scope/container/memory.events
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.256 2 DEBUG nova.compute.manager [req-b3dc5ec0-a194-491d-9165-ffdd5e198d31 req-e7e9ba47-2b0f-4527-bc42-2088b37dc0c5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] No waiting events found dispatching network-vif-unplugged-23321592-5912-475c-80cc-9fe5944d128d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.257 2 DEBUG nova.compute.manager [req-b3dc5ec0-a194-491d-9165-ffdd5e198d31 req-e7e9ba47-2b0f-4527-bc42-2088b37dc0c5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-unplugged-23321592-5912-475c-80cc-9fe5944d128d for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:18:24 compute-0 podman[221934]: 2025-10-01 14:18:24.331946132 +0000 UTC m=+0.051769534 container died c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Oct 01 14:18:24 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0-userdata-shm.mount: Deactivated successfully.
Oct 01 14:18:24 compute-0 systemd[1]: var-lib-containers-storage-overlay-ac1ed7bcb47da107e4572e7d3f71d82895e1c01344592b2461070e85d70ef618-merged.mount: Deactivated successfully.
Oct 01 14:18:24 compute-0 podman[221934]: 2025-10-01 14:18:24.366937029 +0000 UTC m=+0.086760411 container cleanup c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Oct 01 14:18:24 compute-0 systemd[1]: libpod-conmon-c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0.scope: Deactivated successfully.
Oct 01 14:18:24 compute-0 podman[221939]: 2025-10-01 14:18:24.386106469 +0000 UTC m=+0.073042940 container remove c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.394 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[2c8fa7d9-0933-4d5d-b76f-2d04c68add59]: (4, ("Wed Oct  1 02:18:24 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 (c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0)\nc23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0\nWed Oct  1 02:18:24 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 (c23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0)\nc23feed40c642ca91a4f23527f3c26744f6482bba7ddf7a250ce0fdf672852d0\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.397 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d5f900d6-304a-444f-859b-80238fafc912]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.397 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.398 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e66726c9-4957-44f1-b78d-0d09c23e41b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.399 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:18:24 compute-0 kernel: tap031a8987-80: left promiscuous mode
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.420 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[71a18e06-f31a-4549-a8a5-c53664a8ac18]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.447 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[b5c1d24d-a470-4ce2-9a1f-9462cf0508aa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.448 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[44fa2513-867f-4a28-b211-3d088c735412]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.464 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ced6e676-c2ff-4a1a-a915-a4e901bb00d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 461229, 'reachable_time': 24955, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221982, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:18:24 compute-0 systemd[1]: run-netns-ovnmeta\x2d031a8987\x2d8430\x2d4fb6\x2da464\x2d01e4dca2fae7.mount: Deactivated successfully.
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.470 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:18:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:24.470 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[ec7359f4-cbaf-420e-a5c4-ef4a6a494a87]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.527 2 DEBUG nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.527 2 DEBUG nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.528 2 DEBUG nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.528 2 DEBUG nova.virt.libvirt.guest [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.528 2 INFO nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Migration operation has completed
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.528 2 INFO nova.compute.manager [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] _post_live_migration() is started..
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.539 2 WARNING neutronclient.v2_0.client [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.540 2 WARNING neutronclient.v2_0.client [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:18:24 compute-0 nova_compute[192698]: 2025-10-01 14:18:24.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:18:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:18:25.431 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.627 2 DEBUG nova.network.neutron [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Activated binding for port 23321592-5912-475c-80cc-9fe5944d128d and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.629 2 DEBUG nova.compute.manager [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.630 2 DEBUG nova.virt.libvirt.vif [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:17:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-624439701',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-624439701',id=17,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:17:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-l3gig426',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:18:01Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=aad5638f-3b4c-43c9-a453-2cd987bcc593,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.631 2 DEBUG nova.network.os_vif_util [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "23321592-5912-475c-80cc-9fe5944d128d", "address": "fa:16:3e:63:24:e1", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap23321592-59", "ovs_interfaceid": "23321592-5912-475c-80cc-9fe5944d128d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.632 2 DEBUG nova.network.os_vif_util [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:24:e1,bridge_name='br-int',has_traffic_filtering=True,id=23321592-5912-475c-80cc-9fe5944d128d,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23321592-59') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.632 2 DEBUG os_vif [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:24:e1,bridge_name='br-int',has_traffic_filtering=True,id=23321592-5912-475c-80cc-9fe5944d128d,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23321592-59') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.635 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap23321592-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.640 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c5a21c36-7c10-4df0-b8fa-cc37e99531cf) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.645 2 INFO os_vif [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:24:e1,bridge_name='br-int',has_traffic_filtering=True,id=23321592-5912-475c-80cc-9fe5944d128d,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap23321592-59')
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.645 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.645 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.645 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.646 2 DEBUG nova.compute.manager [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.646 2 INFO nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Deleting instance files /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593_del
Oct 01 14:18:25 compute-0 nova_compute[192698]: 2025-10-01 14:18:25.647 2 INFO nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Deletion of /var/lib/nova/instances/aad5638f-3b4c-43c9-a453-2cd987bcc593_del complete
Oct 01 14:18:26 compute-0 podman[221983]: 2025-10-01 14:18:26.17047267 +0000 UTC m=+0.082958988 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.334 2 DEBUG nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.335 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.335 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.335 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.335 2 DEBUG nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] No waiting events found dispatching network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.335 2 WARNING nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received unexpected event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d for instance with vm_state active and task_state migrating.
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.336 2 DEBUG nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-unplugged-23321592-5912-475c-80cc-9fe5944d128d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.336 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.336 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.336 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.336 2 DEBUG nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] No waiting events found dispatching network-vif-unplugged-23321592-5912-475c-80cc-9fe5944d128d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.336 2 DEBUG nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-unplugged-23321592-5912-475c-80cc-9fe5944d128d for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.336 2 DEBUG nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-unplugged-23321592-5912-475c-80cc-9fe5944d128d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.337 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.337 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.337 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.337 2 DEBUG nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] No waiting events found dispatching network-vif-unplugged-23321592-5912-475c-80cc-9fe5944d128d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.337 2 DEBUG nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-unplugged-23321592-5912-475c-80cc-9fe5944d128d for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.337 2 DEBUG nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.338 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.338 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.338 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.338 2 DEBUG nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] No waiting events found dispatching network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.338 2 WARNING nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received unexpected event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d for instance with vm_state active and task_state migrating.
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.338 2 DEBUG nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.338 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.339 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.339 2 DEBUG oslo_concurrency.lockutils [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.339 2 DEBUG nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] No waiting events found dispatching network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:18:26 compute-0 nova_compute[192698]: 2025-10-01 14:18:26.339 2 WARNING nova.compute.manager [req-6bf74dd7-7bf4-4c83-af43-acfbdf8ffec7 req-2159057f-ac6d-486c-a222-beb9f807572f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Received unexpected event network-vif-plugged-23321592-5912-475c-80cc-9fe5944d128d for instance with vm_state active and task_state migrating.
Oct 01 14:18:28 compute-0 nova_compute[192698]: 2025-10-01 14:18:28.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:28 compute-0 nova_compute[192698]: 2025-10-01 14:18:28.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:18:28 compute-0 nova_compute[192698]: 2025-10-01 14:18:28.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:18:29 compute-0 podman[203144]: time="2025-10-01T14:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:18:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:18:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3023 "" "Go-http-client/1.1"
Oct 01 14:18:30 compute-0 nova_compute[192698]: 2025-10-01 14:18:30.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:31 compute-0 openstack_network_exporter[205307]: ERROR   14:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:18:31 compute-0 openstack_network_exporter[205307]: ERROR   14:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:18:31 compute-0 openstack_network_exporter[205307]: ERROR   14:18:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:18:31 compute-0 openstack_network_exporter[205307]: ERROR   14:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:18:31 compute-0 openstack_network_exporter[205307]: ERROR   14:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:18:33 compute-0 nova_compute[192698]: 2025-10-01 14:18:33.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:35 compute-0 nova_compute[192698]: 2025-10-01 14:18:35.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:36 compute-0 nova_compute[192698]: 2025-10-01 14:18:36.189 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:36 compute-0 nova_compute[192698]: 2025-10-01 14:18:36.190 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:36 compute-0 nova_compute[192698]: 2025-10-01 14:18:36.191 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "aad5638f-3b4c-43c9-a453-2cd987bcc593-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:36 compute-0 nova_compute[192698]: 2025-10-01 14:18:36.707 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:36 compute-0 nova_compute[192698]: 2025-10-01 14:18:36.708 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:36 compute-0 nova_compute[192698]: 2025-10-01 14:18:36.708 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:36 compute-0 nova_compute[192698]: 2025-10-01 14:18:36.709 2 DEBUG nova.compute.resource_tracker [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:18:36 compute-0 nova_compute[192698]: 2025-10-01 14:18:36.914 2 WARNING nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:18:36 compute-0 nova_compute[192698]: 2025-10-01 14:18:36.915 2 DEBUG oslo_concurrency.processutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:18:36 compute-0 nova_compute[192698]: 2025-10-01 14:18:36.944 2 DEBUG oslo_concurrency.processutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:18:36 compute-0 nova_compute[192698]: 2025-10-01 14:18:36.945 2 DEBUG nova.compute.resource_tracker [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5817MB free_disk=73.30319213867188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:18:36 compute-0 nova_compute[192698]: 2025-10-01 14:18:36.945 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:18:36 compute-0 nova_compute[192698]: 2025-10-01 14:18:36.946 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:18:37 compute-0 nova_compute[192698]: 2025-10-01 14:18:37.968 2 DEBUG nova.compute.resource_tracker [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Migration for instance aad5638f-3b4c-43c9-a453-2cd987bcc593 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 01 14:18:38 compute-0 podman[222010]: 2025-10-01 14:18:38.161372109 +0000 UTC m=+0.073249465 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 01 14:18:38 compute-0 podman[222011]: 2025-10-01 14:18:38.20570711 +0000 UTC m=+0.112201360 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 01 14:18:38 compute-0 nova_compute[192698]: 2025-10-01 14:18:38.476 2 DEBUG nova.compute.resource_tracker [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 01 14:18:38 compute-0 nova_compute[192698]: 2025-10-01 14:18:38.503 2 DEBUG nova.compute.resource_tracker [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Migration 2925e0b3-6229-4941-a681-1afe3691fe7f is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Oct 01 14:18:38 compute-0 nova_compute[192698]: 2025-10-01 14:18:38.503 2 DEBUG nova.compute.resource_tracker [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:18:38 compute-0 nova_compute[192698]: 2025-10-01 14:18:38.503 2 DEBUG nova.compute.resource_tracker [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:18:36 up  1:17,  0 user,  load average: 0.23, 0.24, 0.34\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:18:38 compute-0 nova_compute[192698]: 2025-10-01 14:18:38.540 2 DEBUG nova.compute.provider_tree [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:18:38 compute-0 nova_compute[192698]: 2025-10-01 14:18:38.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:39 compute-0 nova_compute[192698]: 2025-10-01 14:18:39.049 2 DEBUG nova.scheduler.client.report [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:18:39 compute-0 nova_compute[192698]: 2025-10-01 14:18:39.561 2 DEBUG nova.compute.resource_tracker [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:18:39 compute-0 nova_compute[192698]: 2025-10-01 14:18:39.562 2 DEBUG oslo_concurrency.lockutils [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.616s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:18:39 compute-0 nova_compute[192698]: 2025-10-01 14:18:39.584 2 INFO nova.compute.manager [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Oct 01 14:18:39 compute-0 nova_compute[192698]: 2025-10-01 14:18:39.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:18:39 compute-0 nova_compute[192698]: 2025-10-01 14:18:39.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 01 14:18:40 compute-0 nova_compute[192698]: 2025-10-01 14:18:40.433 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 01 14:18:40 compute-0 nova_compute[192698]: 2025-10-01 14:18:40.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:40 compute-0 nova_compute[192698]: 2025-10-01 14:18:40.680 2 INFO nova.scheduler.client.report [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Deleted allocation for migration 2925e0b3-6229-4941-a681-1afe3691fe7f
Oct 01 14:18:40 compute-0 nova_compute[192698]: 2025-10-01 14:18:40.681 2 DEBUG nova.virt.libvirt.driver [None req-d6f394be-7c93-4bd5-9042-206631161fbf a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: aad5638f-3b4c-43c9-a453-2cd987bcc593] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Oct 01 14:18:43 compute-0 nova_compute[192698]: 2025-10-01 14:18:43.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:45 compute-0 nova_compute[192698]: 2025-10-01 14:18:45.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:45 compute-0 podman[222056]: 2025-10-01 14:18:45.756563675 +0000 UTC m=+0.077541021 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 01 14:18:48 compute-0 nova_compute[192698]: 2025-10-01 14:18:48.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:50 compute-0 nova_compute[192698]: 2025-10-01 14:18:50.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:51 compute-0 podman[222078]: 2025-10-01 14:18:51.177280213 +0000 UTC m=+0.072652709 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Oct 01 14:18:51 compute-0 podman[222077]: 2025-10-01 14:18:51.184728655 +0000 UTC m=+0.091795787 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 01 14:18:51 compute-0 nova_compute[192698]: 2025-10-01 14:18:51.218 2 DEBUG nova.compute.manager [None req-967900a7-b3db-46b4-8c1c-3ca86393644c 1b0ba8d8c771490ab1005529976fdb7e 9dacac6049d34f02846f752af09ae16f - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Oct 01 14:18:51 compute-0 nova_compute[192698]: 2025-10-01 14:18:51.294 2 DEBUG nova.compute.provider_tree [None req-967900a7-b3db-46b4-8c1c-3ca86393644c 1b0ba8d8c771490ab1005529976fdb7e 9dacac6049d34f02846f752af09ae16f - - default default] Updating resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 generation from 21 to 23 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 01 14:18:53 compute-0 nova_compute[192698]: 2025-10-01 14:18:53.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:55 compute-0 nova_compute[192698]: 2025-10-01 14:18:55.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:57 compute-0 podman[222116]: 2025-10-01 14:18:57.201440725 +0000 UTC m=+0.109835286 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:18:58 compute-0 nova_compute[192698]: 2025-10-01 14:18:58.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:18:59 compute-0 podman[203144]: time="2025-10-01T14:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:18:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:18:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3020 "" "Go-http-client/1.1"
Oct 01 14:19:00 compute-0 nova_compute[192698]: 2025-10-01 14:19:00.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:01 compute-0 openstack_network_exporter[205307]: ERROR   14:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:19:01 compute-0 openstack_network_exporter[205307]: ERROR   14:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:19:01 compute-0 openstack_network_exporter[205307]: ERROR   14:19:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:19:01 compute-0 openstack_network_exporter[205307]: ERROR   14:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:19:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:19:01 compute-0 openstack_network_exporter[205307]: ERROR   14:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:19:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:19:03 compute-0 nova_compute[192698]: 2025-10-01 14:19:03.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:05 compute-0 nova_compute[192698]: 2025-10-01 14:19:05.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:08 compute-0 nova_compute[192698]: 2025-10-01 14:19:08.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:09 compute-0 podman[222141]: 2025-10-01 14:19:09.196523598 +0000 UTC m=+0.098156909 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930)
Oct 01 14:19:09 compute-0 podman[222142]: 2025-10-01 14:19:09.238930517 +0000 UTC m=+0.131356509 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Oct 01 14:19:10 compute-0 nova_compute[192698]: 2025-10-01 14:19:10.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:13 compute-0 nova_compute[192698]: 2025-10-01 14:19:13.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:14.268 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:19:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:14.269 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:19:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:14.269 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:19:15 compute-0 nova_compute[192698]: 2025-10-01 14:19:15.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:16 compute-0 podman[222186]: 2025-10-01 14:19:16.139885077 +0000 UTC m=+0.058073144 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7)
Oct 01 14:19:17 compute-0 nova_compute[192698]: 2025-10-01 14:19:17.434 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:19:17 compute-0 nova_compute[192698]: 2025-10-01 14:19:17.948 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:19:17 compute-0 nova_compute[192698]: 2025-10-01 14:19:17.949 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:19:17 compute-0 nova_compute[192698]: 2025-10-01 14:19:17.949 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:19:17 compute-0 nova_compute[192698]: 2025-10-01 14:19:17.949 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:19:18 compute-0 nova_compute[192698]: 2025-10-01 14:19:18.133 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:19:18 compute-0 nova_compute[192698]: 2025-10-01 14:19:18.134 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:19:18 compute-0 nova_compute[192698]: 2025-10-01 14:19:18.155 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:19:18 compute-0 nova_compute[192698]: 2025-10-01 14:19:18.156 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5816MB free_disk=73.30319213867188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:19:18 compute-0 nova_compute[192698]: 2025-10-01 14:19:18.156 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:19:18 compute-0 nova_compute[192698]: 2025-10-01 14:19:18.157 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:19:18 compute-0 nova_compute[192698]: 2025-10-01 14:19:18.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:19 compute-0 nova_compute[192698]: 2025-10-01 14:19:19.252 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:19:19 compute-0 nova_compute[192698]: 2025-10-01 14:19:19.253 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:19:18 up  1:18,  0 user,  load average: 0.21, 0.24, 0.33\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:19:19 compute-0 nova_compute[192698]: 2025-10-01 14:19:19.312 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:19:19 compute-0 nova_compute[192698]: 2025-10-01 14:19:19.820 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:19:20 compute-0 nova_compute[192698]: 2025-10-01 14:19:20.330 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:19:20 compute-0 nova_compute[192698]: 2025-10-01 14:19:20.331 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.174s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:19:20 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:20.469 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:19:20 compute-0 nova_compute[192698]: 2025-10-01 14:19:20.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:20 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:20.471 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:19:20 compute-0 nova_compute[192698]: 2025-10-01 14:19:20.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:21 compute-0 nova_compute[192698]: 2025-10-01 14:19:21.822 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:19:21 compute-0 nova_compute[192698]: 2025-10-01 14:19:21.822 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:19:21 compute-0 nova_compute[192698]: 2025-10-01 14:19:21.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:19:22 compute-0 podman[222210]: 2025-10-01 14:19:22.166831986 +0000 UTC m=+0.081040357 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:19:22 compute-0 podman[222211]: 2025-10-01 14:19:22.168613234 +0000 UTC m=+0.076411921 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 01 14:19:22 compute-0 nova_compute[192698]: 2025-10-01 14:19:22.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:19:22 compute-0 nova_compute[192698]: 2025-10-01 14:19:22.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:19:23 compute-0 nova_compute[192698]: 2025-10-01 14:19:23.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:25 compute-0 nova_compute[192698]: 2025-10-01 14:19:25.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:25 compute-0 nova_compute[192698]: 2025-10-01 14:19:25.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:19:26 compute-0 nova_compute[192698]: 2025-10-01 14:19:26.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:19:27 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:27.472 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:19:28 compute-0 podman[222248]: 2025-10-01 14:19:28.169797234 +0000 UTC m=+0.084800958 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:19:28 compute-0 nova_compute[192698]: 2025-10-01 14:19:28.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:28 compute-0 nova_compute[192698]: 2025-10-01 14:19:28.952 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "02793c05-e4d6-429f-827a-83af4ed29eaf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:19:28 compute-0 nova_compute[192698]: 2025-10-01 14:19:28.953 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:19:29 compute-0 nova_compute[192698]: 2025-10-01 14:19:29.459 2 DEBUG nova.compute.manager [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 01 14:19:29 compute-0 podman[203144]: time="2025-10-01T14:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:19:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:19:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3024 "" "Go-http-client/1.1"
Oct 01 14:19:29 compute-0 nova_compute[192698]: 2025-10-01 14:19:29.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:19:29 compute-0 nova_compute[192698]: 2025-10-01 14:19:29.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:19:30 compute-0 nova_compute[192698]: 2025-10-01 14:19:30.146 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:19:30 compute-0 nova_compute[192698]: 2025-10-01 14:19:30.147 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:19:30 compute-0 nova_compute[192698]: 2025-10-01 14:19:30.155 2 DEBUG nova.virt.hardware [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:19:30 compute-0 nova_compute[192698]: 2025-10-01 14:19:30.155 2 INFO nova.compute.claims [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:19:30 compute-0 nova_compute[192698]: 2025-10-01 14:19:30.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:31 compute-0 nova_compute[192698]: 2025-10-01 14:19:31.220 2 DEBUG nova.compute.provider_tree [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:19:31 compute-0 openstack_network_exporter[205307]: ERROR   14:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:19:31 compute-0 openstack_network_exporter[205307]: ERROR   14:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:19:31 compute-0 openstack_network_exporter[205307]: ERROR   14:19:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:19:31 compute-0 openstack_network_exporter[205307]: ERROR   14:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:19:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:19:31 compute-0 openstack_network_exporter[205307]: ERROR   14:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:19:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:19:31 compute-0 nova_compute[192698]: 2025-10-01 14:19:31.727 2 DEBUG nova.scheduler.client.report [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:19:32 compute-0 nova_compute[192698]: 2025-10-01 14:19:32.245 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:19:32 compute-0 nova_compute[192698]: 2025-10-01 14:19:32.246 2 DEBUG nova.compute.manager [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 01 14:19:32 compute-0 nova_compute[192698]: 2025-10-01 14:19:32.758 2 DEBUG nova.compute.manager [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 01 14:19:32 compute-0 nova_compute[192698]: 2025-10-01 14:19:32.759 2 DEBUG nova.network.neutron [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 01 14:19:32 compute-0 nova_compute[192698]: 2025-10-01 14:19:32.760 2 WARNING neutronclient.v2_0.client [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:19:32 compute-0 nova_compute[192698]: 2025-10-01 14:19:32.760 2 WARNING neutronclient.v2_0.client [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:19:33 compute-0 nova_compute[192698]: 2025-10-01 14:19:33.268 2 INFO nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 01 14:19:33 compute-0 nova_compute[192698]: 2025-10-01 14:19:33.730 2 DEBUG nova.network.neutron [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Successfully created port: a7d8619c-08fc-4631-ae5e-d12856c1a1e1 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 01 14:19:33 compute-0 nova_compute[192698]: 2025-10-01 14:19:33.778 2 DEBUG nova.compute.manager [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 01 14:19:33 compute-0 nova_compute[192698]: 2025-10-01 14:19:33.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.612 2 DEBUG nova.network.neutron [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Successfully updated port: a7d8619c-08fc-4631-ae5e-d12856c1a1e1 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.711 2 DEBUG nova.compute.manager [req-9afefb2c-feb6-46f9-b4b6-ae2df9bb0b63 req-a020fbb7-3441-4c05-9aae-4e8aec8f6946 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Received event network-changed-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.711 2 DEBUG nova.compute.manager [req-9afefb2c-feb6-46f9-b4b6-ae2df9bb0b63 req-a020fbb7-3441-4c05-9aae-4e8aec8f6946 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Refreshing instance network info cache due to event network-changed-a7d8619c-08fc-4631-ae5e-d12856c1a1e1. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.712 2 DEBUG oslo_concurrency.lockutils [req-9afefb2c-feb6-46f9-b4b6-ae2df9bb0b63 req-a020fbb7-3441-4c05-9aae-4e8aec8f6946 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-02793c05-e4d6-429f-827a-83af4ed29eaf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.712 2 DEBUG oslo_concurrency.lockutils [req-9afefb2c-feb6-46f9-b4b6-ae2df9bb0b63 req-a020fbb7-3441-4c05-9aae-4e8aec8f6946 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-02793c05-e4d6-429f-827a-83af4ed29eaf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.712 2 DEBUG nova.network.neutron [req-9afefb2c-feb6-46f9-b4b6-ae2df9bb0b63 req-a020fbb7-3441-4c05-9aae-4e8aec8f6946 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Refreshing network info cache for port a7d8619c-08fc-4631-ae5e-d12856c1a1e1 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.801 2 DEBUG nova.compute.manager [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.802 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.803 2 INFO nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Creating image(s)
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.804 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "/var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.804 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "/var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.806 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "/var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.806 2 DEBUG oslo_utils.imageutils.format_inspector [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.811 2 DEBUG oslo_utils.imageutils.format_inspector [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.812 2 DEBUG oslo_concurrency.processutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.899 2 DEBUG oslo_concurrency.processutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.900 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.901 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.901 2 DEBUG oslo_utils.imageutils.format_inspector [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.904 2 DEBUG oslo_utils.imageutils.format_inspector [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.905 2 DEBUG oslo_concurrency.processutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.970 2 DEBUG oslo_concurrency.processutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:19:34 compute-0 nova_compute[192698]: 2025-10-01 14:19:34.971 2 DEBUG oslo_concurrency.processutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.010 2 DEBUG oslo_concurrency.processutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.012 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.012 2 DEBUG oslo_concurrency.processutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.099 2 DEBUG oslo_concurrency.processutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.100 2 DEBUG nova.virt.disk.api [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Checking if we can resize image /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.100 2 DEBUG oslo_concurrency.processutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.121 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "refresh_cache-02793c05-e4d6-429f-827a-83af4ed29eaf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.158 2 DEBUG oslo_concurrency.processutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.158 2 DEBUG nova.virt.disk.api [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Cannot resize image /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.159 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.159 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Ensure instance console log exists: /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.159 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.160 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.160 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.217 2 WARNING neutronclient.v2_0.client [req-9afefb2c-feb6-46f9-b4b6-ae2df9bb0b63 req-a020fbb7-3441-4c05-9aae-4e8aec8f6946 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:35 compute-0 nova_compute[192698]: 2025-10-01 14:19:35.933 2 DEBUG nova.network.neutron [req-9afefb2c-feb6-46f9-b4b6-ae2df9bb0b63 req-a020fbb7-3441-4c05-9aae-4e8aec8f6946 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:19:36 compute-0 nova_compute[192698]: 2025-10-01 14:19:36.194 2 DEBUG nova.network.neutron [req-9afefb2c-feb6-46f9-b4b6-ae2df9bb0b63 req-a020fbb7-3441-4c05-9aae-4e8aec8f6946 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:19:36 compute-0 nova_compute[192698]: 2025-10-01 14:19:36.701 2 DEBUG oslo_concurrency.lockutils [req-9afefb2c-feb6-46f9-b4b6-ae2df9bb0b63 req-a020fbb7-3441-4c05-9aae-4e8aec8f6946 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-02793c05-e4d6-429f-827a-83af4ed29eaf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:19:36 compute-0 nova_compute[192698]: 2025-10-01 14:19:36.702 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquired lock "refresh_cache-02793c05-e4d6-429f-827a-83af4ed29eaf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:19:36 compute-0 nova_compute[192698]: 2025-10-01 14:19:36.703 2 DEBUG nova.network.neutron [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:19:37 compute-0 nova_compute[192698]: 2025-10-01 14:19:37.329 2 DEBUG nova.network.neutron [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:19:37 compute-0 nova_compute[192698]: 2025-10-01 14:19:37.533 2 WARNING neutronclient.v2_0.client [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:19:37 compute-0 nova_compute[192698]: 2025-10-01 14:19:37.673 2 DEBUG nova.network.neutron [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Updating instance_info_cache with network_info: [{"id": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "address": "fa:16:3e:08:53:17", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa7d8619c-08", "ovs_interfaceid": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.179 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Releasing lock "refresh_cache-02793c05-e4d6-429f-827a-83af4ed29eaf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.180 2 DEBUG nova.compute.manager [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Instance network_info: |[{"id": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "address": "fa:16:3e:08:53:17", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa7d8619c-08", "ovs_interfaceid": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.184 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Start _get_guest_xml network_info=[{"id": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "address": "fa:16:3e:08:53:17", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa7d8619c-08", "ovs_interfaceid": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.188 2 WARNING nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.190 2 DEBUG nova.virt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-462029158', uuid='02793c05-e4d6-429f-827a-83af4ed29eaf'), owner=OwnerMeta(userid='f8897741e6ca4770b56d28d05fa3fc42', username='tempest-TestExecuteStrategies-30131345-project-admin', projectid='d43115e3729442e1b68b749acc0dabc8', projectname='tempest-TestExecuteStrategies-30131345'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "address": "fa:16:3e:08:53:17", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa7d8619c-08", "ovs_interfaceid": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759328378.1904454) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.196 2 DEBUG nova.virt.libvirt.host [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.196 2 DEBUG nova.virt.libvirt.host [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.199 2 DEBUG nova.virt.libvirt.host [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.201 2 DEBUG nova.virt.libvirt.host [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.201 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.201 2 DEBUG nova.virt.hardware [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.202 2 DEBUG nova.virt.hardware [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.202 2 DEBUG nova.virt.hardware [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.202 2 DEBUG nova.virt.hardware [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.202 2 DEBUG nova.virt.hardware [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.202 2 DEBUG nova.virt.hardware [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.202 2 DEBUG nova.virt.hardware [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.202 2 DEBUG nova.virt.hardware [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.202 2 DEBUG nova.virt.hardware [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.203 2 DEBUG nova.virt.hardware [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.203 2 DEBUG nova.virt.hardware [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.205 2 DEBUG nova.virt.libvirt.vif [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:19:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-462029158',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-462029158',id=19,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-n7j4s4mz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},ta
gs=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:19:33Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=02793c05-e4d6-429f-827a-83af4ed29eaf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "address": "fa:16:3e:08:53:17", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa7d8619c-08", "ovs_interfaceid": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.206 2 DEBUG nova.network.os_vif_util [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "address": "fa:16:3e:08:53:17", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa7d8619c-08", "ovs_interfaceid": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.206 2 DEBUG nova.network.os_vif_util [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:53:17,bridge_name='br-int',has_traffic_filtering=True,id=a7d8619c-08fc-4631-ae5e-d12856c1a1e1,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa7d8619c-08') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.207 2 DEBUG nova.objects.instance [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 02793c05-e4d6-429f-827a-83af4ed29eaf obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.718 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:19:38 compute-0 nova_compute[192698]:   <uuid>02793c05-e4d6-429f-827a-83af4ed29eaf</uuid>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   <name>instance-00000013</name>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteStrategies-server-462029158</nova:name>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:19:38</nova:creationTime>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:19:38 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:19:38 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:user uuid="f8897741e6ca4770b56d28d05fa3fc42">tempest-TestExecuteStrategies-30131345-project-admin</nova:user>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:project uuid="d43115e3729442e1b68b749acc0dabc8">tempest-TestExecuteStrategies-30131345</nova:project>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         <nova:port uuid="a7d8619c-08fc-4631-ae5e-d12856c1a1e1">
Oct 01 14:19:38 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <system>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <entry name="serial">02793c05-e4d6-429f-827a-83af4ed29eaf</entry>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <entry name="uuid">02793c05-e4d6-429f-827a-83af4ed29eaf</entry>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     </system>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   <os>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   </os>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   <features>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   </features>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk.config"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:08:53:17"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <target dev="tapa7d8619c-08"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/console.log" append="off"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <video>
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     </video>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:19:38 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:19:38 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:19:38 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:19:38 compute-0 nova_compute[192698]: </domain>
Oct 01 14:19:38 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.720 2 DEBUG nova.compute.manager [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Preparing to wait for external event network-vif-plugged-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.721 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.721 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.722 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.723 2 DEBUG nova.virt.libvirt.vif [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:19:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-462029158',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-462029158',id=19,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-n7j4s4mz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-
admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:19:33Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=02793c05-e4d6-429f-827a-83af4ed29eaf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "address": "fa:16:3e:08:53:17", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa7d8619c-08", "ovs_interfaceid": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.724 2 DEBUG nova.network.os_vif_util [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "address": "fa:16:3e:08:53:17", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa7d8619c-08", "ovs_interfaceid": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.725 2 DEBUG nova.network.os_vif_util [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:53:17,bridge_name='br-int',has_traffic_filtering=True,id=a7d8619c-08fc-4631-ae5e-d12856c1a1e1,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa7d8619c-08') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.726 2 DEBUG os_vif [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:53:17,bridge_name='br-int',has_traffic_filtering=True,id=a7d8619c-08fc-4631-ae5e-d12856c1a1e1,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa7d8619c-08') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.727 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.728 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.730 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a6f9a420-c917-58be-8217-cd740db9d516', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.743 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa7d8619c-08, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.744 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa7d8619c-08, col_values=(('qos', UUID('84dc1b24-55df-4824-8291-77560c76a3fe')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.744 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa7d8619c-08, col_values=(('external_ids', {'iface-id': 'a7d8619c-08fc-4631-ae5e-d12856c1a1e1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:08:53:17', 'vm-uuid': '02793c05-e4d6-429f-827a-83af4ed29eaf'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:38 compute-0 NetworkManager[51741]: <info>  [1759328378.7477] manager: (tapa7d8619c-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.757 2 INFO os_vif [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:53:17,bridge_name='br-int',has_traffic_filtering=True,id=a7d8619c-08fc-4631-ae5e-d12856c1a1e1,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa7d8619c-08')
Oct 01 14:19:38 compute-0 nova_compute[192698]: 2025-10-01 14:19:38.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:40 compute-0 podman[222290]: 2025-10-01 14:19:40.158265696 +0000 UTC m=+0.067997843 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible)
Oct 01 14:19:40 compute-0 podman[222291]: 2025-10-01 14:19:40.18833436 +0000 UTC m=+0.101104350 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Oct 01 14:19:40 compute-0 nova_compute[192698]: 2025-10-01 14:19:40.328 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:19:40 compute-0 nova_compute[192698]: 2025-10-01 14:19:40.329 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:19:40 compute-0 nova_compute[192698]: 2025-10-01 14:19:40.329 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] No VIF found with MAC fa:16:3e:08:53:17, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:19:40 compute-0 nova_compute[192698]: 2025-10-01 14:19:40.330 2 INFO nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Using config drive
Oct 01 14:19:40 compute-0 nova_compute[192698]: 2025-10-01 14:19:40.843 2 WARNING neutronclient.v2_0.client [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.017 2 INFO nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Creating config drive at /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk.config
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.027 2 DEBUG oslo_concurrency.processutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpyrry84kk execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.170 2 DEBUG oslo_concurrency.processutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpyrry84kk" returned: 0 in 0.143s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:19:41 compute-0 kernel: tapa7d8619c-08: entered promiscuous mode
Oct 01 14:19:41 compute-0 NetworkManager[51741]: <info>  [1759328381.2790] manager: (tapa7d8619c-08): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Oct 01 14:19:41 compute-0 ovn_controller[94909]: 2025-10-01T14:19:41Z|00144|binding|INFO|Claiming lport a7d8619c-08fc-4631-ae5e-d12856c1a1e1 for this chassis.
Oct 01 14:19:41 compute-0 ovn_controller[94909]: 2025-10-01T14:19:41Z|00145|binding|INFO|a7d8619c-08fc-4631-ae5e-d12856c1a1e1: Claiming fa:16:3e:08:53:17 10.100.0.5
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.287 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:53:17 10.100.0.5'], port_security=['fa:16:3e:08:53:17 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '02793c05-e4d6-429f-827a-83af4ed29eaf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=a7d8619c-08fc-4631-ae5e-d12856c1a1e1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.288 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a7d8619c-08fc-4631-ae5e-d12856c1a1e1 in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 bound to our chassis
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.290 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:19:41 compute-0 ovn_controller[94909]: 2025-10-01T14:19:41Z|00146|binding|INFO|Setting lport a7d8619c-08fc-4631-ae5e-d12856c1a1e1 up in Southbound
Oct 01 14:19:41 compute-0 ovn_controller[94909]: 2025-10-01T14:19:41Z|00147|binding|INFO|Setting lport a7d8619c-08fc-4631-ae5e-d12856c1a1e1 ovn-installed in OVS
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.309 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[df1002eb-2403-4bd2-9997-7d29860202aa]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.311 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap031a8987-81 in ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.315 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap031a8987-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.315 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ae79f9a5-d1fb-43b1-97c8-c6ef0b3426e4]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.317 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[56a3e573-9983-4652-9288-4ecff7bb21bb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 systemd-udevd[222352]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.337 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[dd173c22-5c88-4374-970b-8812050b5a40]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 NetworkManager[51741]: <info>  [1759328381.3436] device (tapa7d8619c-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:19:41 compute-0 NetworkManager[51741]: <info>  [1759328381.3447] device (tapa7d8619c-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:19:41 compute-0 systemd-machined[152704]: New machine qemu-13-instance-00000013.
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.358 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[724ae67c-e386-46a2-afbf-60adade33c12]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000013.
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.407 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[e94d517c-8eed-4570-83af-881961e504c1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.412 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[593fc370-8f47-44df-ab8d-da4e688f8e0b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 NetworkManager[51741]: <info>  [1759328381.4141] manager: (tap031a8987-80): new Veth device (/org/freedesktop/NetworkManager/Devices/60)
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.459 2 DEBUG nova.compute.manager [req-3a8f21a8-2b47-4b85-8866-bc0c4825a703 req-5a14d67a-4130-4ec9-81bb-113a86995f6d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Received event network-vif-plugged-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.460 2 DEBUG oslo_concurrency.lockutils [req-3a8f21a8-2b47-4b85-8866-bc0c4825a703 req-5a14d67a-4130-4ec9-81bb-113a86995f6d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.460 2 DEBUG oslo_concurrency.lockutils [req-3a8f21a8-2b47-4b85-8866-bc0c4825a703 req-5a14d67a-4130-4ec9-81bb-113a86995f6d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.460 2 DEBUG oslo_concurrency.lockutils [req-3a8f21a8-2b47-4b85-8866-bc0c4825a703 req-5a14d67a-4130-4ec9-81bb-113a86995f6d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.461 2 DEBUG nova.compute.manager [req-3a8f21a8-2b47-4b85-8866-bc0c4825a703 req-5a14d67a-4130-4ec9-81bb-113a86995f6d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Processing event network-vif-plugged-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.463 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd0a508-fb4a-49ce-bdc0-16c564316df3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.470 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[2ae8fca3-07a1-4e10-8125-789c2efb843c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 NetworkManager[51741]: <info>  [1759328381.5124] device (tap031a8987-80): carrier: link connected
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.524 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a06499ac-9acc-4630-b2b7-7adf4181dd7f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.550 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e32e7ccc-0851-4a96-9c57-933dd65723ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474204, 'reachable_time': 30003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222387, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.570 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f6b8e10c-1956-4f5b-9069-04d4c78a63d6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:6c81'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474204, 'tstamp': 474204}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222388, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.593 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[db5d621f-2899-4e62-b7d6-39ccea3b6e00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474204, 'reachable_time': 30003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222389, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.633 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[278acfbc-6548-46ac-a45d-d22936ce3d86]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.708 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a1f9c72d-c5fb-4cab-b920-3ffdf577a380]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.709 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.710 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.710 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:19:41 compute-0 kernel: tap031a8987-80: entered promiscuous mode
Oct 01 14:19:41 compute-0 NetworkManager[51741]: <info>  [1759328381.7130] manager: (tap031a8987-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.717 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:41 compute-0 ovn_controller[94909]: 2025-10-01T14:19:41Z|00148|binding|INFO|Releasing lport 6dd814dc-cba2-4392-85ef-eadb8c4615f7 from this chassis (sb_readonly=0)
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.721 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[b49b2be5-f201-4d83-8d33-4e3255627151]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.721 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.722 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.722 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 031a8987-8430-4fb6-a464-01e4dca2fae7 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.722 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.723 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7e580c35-f729-4c9e-9a62-b6243e978618]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.723 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.724 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[0e554d0a-8ff5-4aef-94a2-6a85b83d2c1e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.724 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:19:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:19:41.725 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'env', 'PROCESS_TAG=haproxy-031a8987-8430-4fb6-a464-01e4dca2fae7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/031a8987-8430-4fb6-a464-01e4dca2fae7.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:19:41 compute-0 nova_compute[192698]: 2025-10-01 14:19:41.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:42 compute-0 podman[222428]: 2025-10-01 14:19:42.192331781 +0000 UTC m=+0.060416318 container create 5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:19:42 compute-0 systemd[1]: Started libpod-conmon-5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8.scope.
Oct 01 14:19:42 compute-0 podman[222428]: 2025-10-01 14:19:42.162446331 +0000 UTC m=+0.030530888 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:19:42 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:19:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e36be47be6c5d6e98532476ff23ba4a4cd1efed9460afc15aacbc23c4b99fba5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:19:42 compute-0 podman[222428]: 2025-10-01 14:19:42.314431608 +0000 UTC m=+0.182516235 container init 5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 01 14:19:42 compute-0 podman[222428]: 2025-10-01 14:19:42.32115028 +0000 UTC m=+0.189234847 container start 5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 14:19:42 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[222444]: [NOTICE]   (222448) : New worker (222450) forked
Oct 01 14:19:42 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[222444]: [NOTICE]   (222448) : Loading success.
Oct 01 14:19:42 compute-0 nova_compute[192698]: 2025-10-01 14:19:42.570 2 DEBUG nova.compute.manager [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:19:42 compute-0 nova_compute[192698]: 2025-10-01 14:19:42.578 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 01 14:19:42 compute-0 nova_compute[192698]: 2025-10-01 14:19:42.583 2 INFO nova.virt.libvirt.driver [-] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Instance spawned successfully.
Oct 01 14:19:42 compute-0 nova_compute[192698]: 2025-10-01 14:19:42.583 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.099 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.100 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.101 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.101 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.102 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.103 2 DEBUG nova.virt.libvirt.driver [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.522 2 DEBUG nova.compute.manager [req-28b2e2e5-fe63-426b-8b5b-9f738599c32f req-4ec981d2-3ad1-4b19-bb24-76f0ad1493ed 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Received event network-vif-plugged-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.522 2 DEBUG oslo_concurrency.lockutils [req-28b2e2e5-fe63-426b-8b5b-9f738599c32f req-4ec981d2-3ad1-4b19-bb24-76f0ad1493ed 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.523 2 DEBUG oslo_concurrency.lockutils [req-28b2e2e5-fe63-426b-8b5b-9f738599c32f req-4ec981d2-3ad1-4b19-bb24-76f0ad1493ed 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.523 2 DEBUG oslo_concurrency.lockutils [req-28b2e2e5-fe63-426b-8b5b-9f738599c32f req-4ec981d2-3ad1-4b19-bb24-76f0ad1493ed 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.524 2 DEBUG nova.compute.manager [req-28b2e2e5-fe63-426b-8b5b-9f738599c32f req-4ec981d2-3ad1-4b19-bb24-76f0ad1493ed 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] No waiting events found dispatching network-vif-plugged-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.524 2 WARNING nova.compute.manager [req-28b2e2e5-fe63-426b-8b5b-9f738599c32f req-4ec981d2-3ad1-4b19-bb24-76f0ad1493ed 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Received unexpected event network-vif-plugged-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 for instance with vm_state building and task_state spawning.
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.615 2 INFO nova.compute.manager [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Took 8.81 seconds to spawn the instance on the hypervisor.
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.615 2 DEBUG nova.compute.manager [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:43 compute-0 nova_compute[192698]: 2025-10-01 14:19:43.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:44 compute-0 nova_compute[192698]: 2025-10-01 14:19:44.160 2 INFO nova.compute.manager [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Took 14.19 seconds to build instance.
Oct 01 14:19:44 compute-0 nova_compute[192698]: 2025-10-01 14:19:44.666 2 DEBUG oslo_concurrency.lockutils [None req-d59c2df1-97ff-425a-9a3a-79c4926142ec f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.713s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:19:47 compute-0 podman[222459]: 2025-10-01 14:19:47.162465294 +0000 UTC m=+0.075626479 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git, io.buildah.version=1.33.7, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 01 14:19:48 compute-0 nova_compute[192698]: 2025-10-01 14:19:48.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:48 compute-0 nova_compute[192698]: 2025-10-01 14:19:48.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:53 compute-0 podman[222491]: 2025-10-01 14:19:53.191283792 +0000 UTC m=+0.089775062 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible)
Oct 01 14:19:53 compute-0 podman[222492]: 2025-10-01 14:19:53.199596288 +0000 UTC m=+0.088896759 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 14:19:53 compute-0 nova_compute[192698]: 2025-10-01 14:19:53.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:53 compute-0 nova_compute[192698]: 2025-10-01 14:19:53.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:55 compute-0 ovn_controller[94909]: 2025-10-01T14:19:55Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:08:53:17 10.100.0.5
Oct 01 14:19:55 compute-0 ovn_controller[94909]: 2025-10-01T14:19:55Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:08:53:17 10.100.0.5
Oct 01 14:19:57 compute-0 nova_compute[192698]: 2025-10-01 14:19:57.183 2 DEBUG nova.virt.libvirt.driver [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Creating tmpfile /var/lib/nova/instances/tmpematsjw8 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 01 14:19:57 compute-0 nova_compute[192698]: 2025-10-01 14:19:57.185 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:19:57 compute-0 nova_compute[192698]: 2025-10-01 14:19:57.294 2 DEBUG nova.compute.manager [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpematsjw8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 01 14:19:58 compute-0 nova_compute[192698]: 2025-10-01 14:19:58.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:58 compute-0 nova_compute[192698]: 2025-10-01 14:19:58.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:19:59 compute-0 podman[222533]: 2025-10-01 14:19:59.178746232 +0000 UTC m=+0.082515887 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:19:59 compute-0 nova_compute[192698]: 2025-10-01 14:19:59.372 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:19:59 compute-0 podman[203144]: time="2025-10-01T14:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:19:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:19:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3487 "" "Go-http-client/1.1"
Oct 01 14:20:01 compute-0 openstack_network_exporter[205307]: ERROR   14:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:20:01 compute-0 openstack_network_exporter[205307]: ERROR   14:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:20:01 compute-0 openstack_network_exporter[205307]: ERROR   14:20:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:20:01 compute-0 openstack_network_exporter[205307]: ERROR   14:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:20:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:20:01 compute-0 openstack_network_exporter[205307]: ERROR   14:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:20:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:20:03 compute-0 nova_compute[192698]: 2025-10-01 14:20:03.179 2 DEBUG nova.compute.manager [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpematsjw8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='64f0e219-6df7-4a26-b95b-90f93f33620e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 01 14:20:03 compute-0 nova_compute[192698]: 2025-10-01 14:20:03.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:03 compute-0 nova_compute[192698]: 2025-10-01 14:20:03.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:04 compute-0 nova_compute[192698]: 2025-10-01 14:20:04.197 2 DEBUG oslo_concurrency.lockutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-64f0e219-6df7-4a26-b95b-90f93f33620e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:20:04 compute-0 nova_compute[192698]: 2025-10-01 14:20:04.198 2 DEBUG oslo_concurrency.lockutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-64f0e219-6df7-4a26-b95b-90f93f33620e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:20:04 compute-0 nova_compute[192698]: 2025-10-01 14:20:04.198 2 DEBUG nova.network.neutron [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:20:04 compute-0 nova_compute[192698]: 2025-10-01 14:20:04.705 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:05 compute-0 nova_compute[192698]: 2025-10-01 14:20:05.595 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:05 compute-0 nova_compute[192698]: 2025-10-01 14:20:05.775 2 DEBUG nova.network.neutron [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Updating instance_info_cache with network_info: [{"id": "371e2b01-c6b3-4eb6-ae51-19962f1315ef", "address": "fa:16:3e:5f:4f:08", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap371e2b01-c6", "ovs_interfaceid": "371e2b01-c6b3-4eb6-ae51-19962f1315ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.284 2 DEBUG oslo_concurrency.lockutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-64f0e219-6df7-4a26-b95b-90f93f33620e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.303 2 DEBUG nova.virt.libvirt.driver [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpematsjw8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='64f0e219-6df7-4a26-b95b-90f93f33620e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.304 2 DEBUG nova.virt.libvirt.driver [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Creating instance directory: /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.305 2 DEBUG nova.virt.libvirt.driver [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Creating disk.info with the contents: {'/var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk': 'qcow2', '/var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.305 2 DEBUG nova.virt.libvirt.driver [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.306 2 DEBUG nova.objects.instance [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 64f0e219-6df7-4a26-b95b-90f93f33620e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.816 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.823 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.826 2 DEBUG oslo_concurrency.processutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.928 2 DEBUG oslo_concurrency.processutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.930 2 DEBUG oslo_concurrency.lockutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.931 2 DEBUG oslo_concurrency.lockutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.932 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.938 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:20:06 compute-0 nova_compute[192698]: 2025-10-01 14:20:06.939 2 DEBUG oslo_concurrency.processutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.032 2 DEBUG oslo_concurrency.processutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.034 2 DEBUG oslo_concurrency.processutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.102 2 DEBUG oslo_concurrency.processutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk 1073741824" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.104 2 DEBUG oslo_concurrency.lockutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.173s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.105 2 DEBUG oslo_concurrency.processutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.196 2 DEBUG oslo_concurrency.processutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.197 2 DEBUG nova.virt.disk.api [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.198 2 DEBUG oslo_concurrency.processutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.263 2 DEBUG oslo_concurrency.processutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.264 2 DEBUG nova.virt.disk.api [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.265 2 DEBUG nova.objects.instance [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 64f0e219-6df7-4a26-b95b-90f93f33620e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.776 2 DEBUG nova.objects.base [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<64f0e219-6df7-4a26-b95b-90f93f33620e> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.777 2 DEBUG oslo_concurrency.processutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.808 2 DEBUG oslo_concurrency.processutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk.config 497664" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.809 2 DEBUG nova.virt.libvirt.driver [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.811 2 DEBUG nova.virt.libvirt.vif [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-01T14:19:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2078744306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2078744306',id=18,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:19:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-ryv8hzlj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:19:21Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=64f0e219-6df7-4a26-b95b-90f93f33620e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "371e2b01-c6b3-4eb6-ae51-19962f1315ef", "address": "fa:16:3e:5f:4f:08", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap371e2b01-c6", "ovs_interfaceid": "371e2b01-c6b3-4eb6-ae51-19962f1315ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.812 2 DEBUG nova.network.os_vif_util [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "371e2b01-c6b3-4eb6-ae51-19962f1315ef", "address": "fa:16:3e:5f:4f:08", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap371e2b01-c6", "ovs_interfaceid": "371e2b01-c6b3-4eb6-ae51-19962f1315ef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.813 2 DEBUG nova.network.os_vif_util [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:4f:08,bridge_name='br-int',has_traffic_filtering=True,id=371e2b01-c6b3-4eb6-ae51-19962f1315ef,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap371e2b01-c6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.814 2 DEBUG os_vif [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:4f:08,bridge_name='br-int',has_traffic_filtering=True,id=371e2b01-c6b3-4eb6-ae51-19962f1315ef,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap371e2b01-c6') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.815 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.816 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.817 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'ab29d71d-812f-5f4d-a3be-5bf8bd312c8a', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.826 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap371e2b01-c6, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.826 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap371e2b01-c6, col_values=(('qos', UUID('b9cf8d9e-b27d-42be-b010-c5aa6241ef84')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.827 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap371e2b01-c6, col_values=(('external_ids', {'iface-id': '371e2b01-c6b3-4eb6-ae51-19962f1315ef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:4f:08', 'vm-uuid': '64f0e219-6df7-4a26-b95b-90f93f33620e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:07 compute-0 NetworkManager[51741]: <info>  [1759328407.8295] manager: (tap371e2b01-c6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.835 2 INFO os_vif [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:4f:08,bridge_name='br-int',has_traffic_filtering=True,id=371e2b01-c6b3-4eb6-ae51-19962f1315ef,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap371e2b01-c6')
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.836 2 DEBUG nova.virt.libvirt.driver [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.836 2 DEBUG nova.compute.manager [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpematsjw8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='64f0e219-6df7-4a26-b95b-90f93f33620e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 01 14:20:07 compute-0 nova_compute[192698]: 2025-10-01 14:20:07.837 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:08 compute-0 nova_compute[192698]: 2025-10-01 14:20:08.282 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:08 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:08.557 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:20:08 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:08.559 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:20:08 compute-0 nova_compute[192698]: 2025-10-01 14:20:08.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:08 compute-0 nova_compute[192698]: 2025-10-01 14:20:08.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:09 compute-0 nova_compute[192698]: 2025-10-01 14:20:09.278 2 DEBUG nova.network.neutron [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Port 371e2b01-c6b3-4eb6-ae51-19962f1315ef updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 01 14:20:09 compute-0 nova_compute[192698]: 2025-10-01 14:20:09.311 2 DEBUG nova.compute.manager [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpematsjw8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='64f0e219-6df7-4a26-b95b-90f93f33620e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 01 14:20:09 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:09.561 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:11 compute-0 podman[222579]: 2025-10-01 14:20:11.156255468 +0000 UTC m=+0.069409921 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct 01 14:20:11 compute-0 podman[222580]: 2025-10-01 14:20:11.183082585 +0000 UTC m=+0.096388072 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Oct 01 14:20:11 compute-0 ovn_controller[94909]: 2025-10-01T14:20:11Z|00149|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 01 14:20:12 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 01 14:20:12 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 01 14:20:12 compute-0 kernel: tap371e2b01-c6: entered promiscuous mode
Oct 01 14:20:12 compute-0 NetworkManager[51741]: <info>  [1759328412.5435] manager: (tap371e2b01-c6): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Oct 01 14:20:12 compute-0 nova_compute[192698]: 2025-10-01 14:20:12.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:12 compute-0 ovn_controller[94909]: 2025-10-01T14:20:12Z|00150|binding|INFO|Claiming lport 371e2b01-c6b3-4eb6-ae51-19962f1315ef for this additional chassis.
Oct 01 14:20:12 compute-0 ovn_controller[94909]: 2025-10-01T14:20:12Z|00151|binding|INFO|371e2b01-c6b3-4eb6-ae51-19962f1315ef: Claiming fa:16:3e:5f:4f:08 10.100.0.4
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.552 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:4f:08 10.100.0.4'], port_security=['fa:16:3e:5f:4f:08 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '64f0e219-6df7-4a26-b95b-90f93f33620e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '10', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=371e2b01-c6b3-4eb6-ae51-19962f1315ef) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.553 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 371e2b01-c6b3-4eb6-ae51-19962f1315ef in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.554 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:20:12 compute-0 ovn_controller[94909]: 2025-10-01T14:20:12Z|00152|binding|INFO|Setting lport 371e2b01-c6b3-4eb6-ae51-19962f1315ef ovn-installed in OVS
Oct 01 14:20:12 compute-0 nova_compute[192698]: 2025-10-01 14:20:12.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:12 compute-0 nova_compute[192698]: 2025-10-01 14:20:12.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.573 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4d1b2cb7-e417-43ea-809c-6ad14ff2edf7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:12 compute-0 systemd-machined[152704]: New machine qemu-14-instance-00000012.
Oct 01 14:20:12 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000012.
Oct 01 14:20:12 compute-0 systemd-udevd[222660]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.615 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[4fbaf130-cb82-45ff-af43-1ebe94498d44]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.619 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a016bab4-d040-4ff0-9687-e67e4e6c98ee]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:12 compute-0 NetworkManager[51741]: <info>  [1759328412.6342] device (tap371e2b01-c6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:20:12 compute-0 NetworkManager[51741]: <info>  [1759328412.6369] device (tap371e2b01-c6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.657 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[cd32d607-bc8f-41a5-95fd-b1bebe3ea60b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.676 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[46f39daf-24da-44ff-99ea-9f9f68fa339a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474204, 'reachable_time': 30003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222669, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.697 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[43c4cdf3-1d1f-4612-b298-550592aa4cc0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474219, 'tstamp': 474219}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222671, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474223, 'tstamp': 474223}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222671, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.699 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:12 compute-0 nova_compute[192698]: 2025-10-01 14:20:12.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:12 compute-0 nova_compute[192698]: 2025-10-01 14:20:12.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.703 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.703 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.703 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.704 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:20:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:12.705 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[65ef9aad-d839-4ccf-aced-a5f62b855c7f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:12 compute-0 nova_compute[192698]: 2025-10-01 14:20:12.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:13 compute-0 nova_compute[192698]: 2025-10-01 14:20:13.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:14.270 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:14.272 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:14.274 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:15 compute-0 ovn_controller[94909]: 2025-10-01T14:20:15Z|00153|binding|INFO|Claiming lport 371e2b01-c6b3-4eb6-ae51-19962f1315ef for this chassis.
Oct 01 14:20:15 compute-0 ovn_controller[94909]: 2025-10-01T14:20:15Z|00154|binding|INFO|371e2b01-c6b3-4eb6-ae51-19962f1315ef: Claiming fa:16:3e:5f:4f:08 10.100.0.4
Oct 01 14:20:15 compute-0 ovn_controller[94909]: 2025-10-01T14:20:15Z|00155|binding|INFO|Setting lport 371e2b01-c6b3-4eb6-ae51-19962f1315ef up in Southbound
Oct 01 14:20:16 compute-0 nova_compute[192698]: 2025-10-01 14:20:16.801 2 INFO nova.compute.manager [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Post operation of migration started
Oct 01 14:20:16 compute-0 nova_compute[192698]: 2025-10-01 14:20:16.802 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:17 compute-0 nova_compute[192698]: 2025-10-01 14:20:17.319 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:17 compute-0 nova_compute[192698]: 2025-10-01 14:20:17.320 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:17 compute-0 nova_compute[192698]: 2025-10-01 14:20:17.481 2 DEBUG oslo_concurrency.lockutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-64f0e219-6df7-4a26-b95b-90f93f33620e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:20:17 compute-0 nova_compute[192698]: 2025-10-01 14:20:17.482 2 DEBUG oslo_concurrency.lockutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-64f0e219-6df7-4a26-b95b-90f93f33620e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:20:17 compute-0 nova_compute[192698]: 2025-10-01 14:20:17.482 2 DEBUG nova.network.neutron [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:20:17 compute-0 nova_compute[192698]: 2025-10-01 14:20:17.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:17 compute-0 nova_compute[192698]: 2025-10-01 14:20:17.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:20:17 compute-0 nova_compute[192698]: 2025-10-01 14:20:17.988 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:18 compute-0 podman[222696]: 2025-10-01 14:20:18.20683129 +0000 UTC m=+0.102086995 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_id=edpm, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 01 14:20:18 compute-0 nova_compute[192698]: 2025-10-01 14:20:18.438 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:18 compute-0 nova_compute[192698]: 2025-10-01 14:20:18.439 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:18 compute-0 nova_compute[192698]: 2025-10-01 14:20:18.439 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:18 compute-0 nova_compute[192698]: 2025-10-01 14:20:18.439 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:20:18 compute-0 nova_compute[192698]: 2025-10-01 14:20:18.644 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:18 compute-0 nova_compute[192698]: 2025-10-01 14:20:18.820 2 DEBUG nova.network.neutron [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Updating instance_info_cache with network_info: [{"id": "371e2b01-c6b3-4eb6-ae51-19962f1315ef", "address": "fa:16:3e:5f:4f:08", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap371e2b01-c6", "ovs_interfaceid": "371e2b01-c6b3-4eb6-ae51-19962f1315ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:20:18 compute-0 nova_compute[192698]: 2025-10-01 14:20:18.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.329 2 DEBUG oslo_concurrency.lockutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-64f0e219-6df7-4a26-b95b-90f93f33620e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.503 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.591 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.592 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.658 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.665 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.725 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.726 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.778 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.854 2 DEBUG oslo_concurrency.lockutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.855 2 DEBUG oslo_concurrency.lockutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.856 2 DEBUG oslo_concurrency.lockutils [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.859 2 INFO nova.virt.libvirt.driver [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 01 14:20:19 compute-0 virtqemud[192597]: Domain id=14 name='instance-00000012' uuid=64f0e219-6df7-4a26-b95b-90f93f33620e is tainted: custom-monitor
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.945 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.946 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.980 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.981 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5498MB free_disk=73.24555969238281GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.981 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:19 compute-0 nova_compute[192698]: 2025-10-01 14:20:19.982 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:20 compute-0 nova_compute[192698]: 2025-10-01 14:20:20.866 2 INFO nova.virt.libvirt.driver [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 01 14:20:21 compute-0 nova_compute[192698]: 2025-10-01 14:20:21.003 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Migration for instance 64f0e219-6df7-4a26-b95b-90f93f33620e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 01 14:20:21 compute-0 nova_compute[192698]: 2025-10-01 14:20:21.513 2 INFO nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Updating resource usage from migration 17619145-a8a9-4a17-8a14-d97744134909
Oct 01 14:20:21 compute-0 nova_compute[192698]: 2025-10-01 14:20:21.514 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Starting to track incoming migration 17619145-a8a9-4a17-8a14-d97744134909 with flavor 69702c4b-38f2-49d1-96d5-85671652c67e _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 01 14:20:21 compute-0 nova_compute[192698]: 2025-10-01 14:20:21.875 2 INFO nova.virt.libvirt.driver [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 01 14:20:21 compute-0 nova_compute[192698]: 2025-10-01 14:20:21.879 2 DEBUG nova.compute.manager [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:20:22 compute-0 nova_compute[192698]: 2025-10-01 14:20:22.052 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 02793c05-e4d6-429f-827a-83af4ed29eaf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:20:22 compute-0 nova_compute[192698]: 2025-10-01 14:20:22.388 2 DEBUG nova.objects.instance [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:20:22 compute-0 nova_compute[192698]: 2025-10-01 14:20:22.559 2 WARNING nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 64f0e219-6df7-4a26-b95b-90f93f33620e has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 01 14:20:22 compute-0 nova_compute[192698]: 2025-10-01 14:20:22.560 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:20:22 compute-0 nova_compute[192698]: 2025-10-01 14:20:22.560 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:20:19 up  1:19,  0 user,  load average: 0.26, 0.24, 0.33\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_d43115e3729442e1b68b749acc0dabc8': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:20:22 compute-0 nova_compute[192698]: 2025-10-01 14:20:22.629 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:20:22 compute-0 nova_compute[192698]: 2025-10-01 14:20:22.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:23 compute-0 nova_compute[192698]: 2025-10-01 14:20:23.137 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:20:23 compute-0 nova_compute[192698]: 2025-10-01 14:20:23.416 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:23 compute-0 nova_compute[192698]: 2025-10-01 14:20:23.657 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:20:23 compute-0 nova_compute[192698]: 2025-10-01 14:20:23.658 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.676s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:23 compute-0 nova_compute[192698]: 2025-10-01 14:20:23.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:24 compute-0 podman[222732]: 2025-10-01 14:20:24.188981516 +0000 UTC m=+0.092162096 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid)
Oct 01 14:20:24 compute-0 podman[222733]: 2025-10-01 14:20:24.204859525 +0000 UTC m=+0.097690036 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 01 14:20:24 compute-0 nova_compute[192698]: 2025-10-01 14:20:24.315 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:24 compute-0 nova_compute[192698]: 2025-10-01 14:20:24.316 2 WARNING neutronclient.v2_0.client [None req-c40fa24e-59aa-4175-b112-fe45fba23c10 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:24 compute-0 nova_compute[192698]: 2025-10-01 14:20:24.659 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:20:24 compute-0 nova_compute[192698]: 2025-10-01 14:20:24.659 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:20:24 compute-0 nova_compute[192698]: 2025-10-01 14:20:24.660 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:20:24 compute-0 nova_compute[192698]: 2025-10-01 14:20:24.660 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:20:26 compute-0 nova_compute[192698]: 2025-10-01 14:20:26.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:20:27 compute-0 nova_compute[192698]: 2025-10-01 14:20:27.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:28 compute-0 nova_compute[192698]: 2025-10-01 14:20:28.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:20:28 compute-0 nova_compute[192698]: 2025-10-01 14:20:28.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:29 compute-0 podman[203144]: time="2025-10-01T14:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:20:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:20:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3482 "" "Go-http-client/1.1"
Oct 01 14:20:29 compute-0 nova_compute[192698]: 2025-10-01 14:20:29.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:20:29 compute-0 nova_compute[192698]: 2025-10-01 14:20:29.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:20:30 compute-0 podman[222773]: 2025-10-01 14:20:30.180265487 +0000 UTC m=+0.086839873 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:20:31 compute-0 openstack_network_exporter[205307]: ERROR   14:20:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:20:31 compute-0 openstack_network_exporter[205307]: ERROR   14:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:20:31 compute-0 openstack_network_exporter[205307]: ERROR   14:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:20:31 compute-0 openstack_network_exporter[205307]: ERROR   14:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:20:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:20:31 compute-0 openstack_network_exporter[205307]: ERROR   14:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:20:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:20:31 compute-0 nova_compute[192698]: 2025-10-01 14:20:31.560 2 DEBUG oslo_concurrency.lockutils [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "02793c05-e4d6-429f-827a-83af4ed29eaf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:31 compute-0 nova_compute[192698]: 2025-10-01 14:20:31.561 2 DEBUG oslo_concurrency.lockutils [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:31 compute-0 nova_compute[192698]: 2025-10-01 14:20:31.562 2 DEBUG oslo_concurrency.lockutils [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:31 compute-0 nova_compute[192698]: 2025-10-01 14:20:31.562 2 DEBUG oslo_concurrency.lockutils [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:31 compute-0 nova_compute[192698]: 2025-10-01 14:20:31.563 2 DEBUG oslo_concurrency.lockutils [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:31 compute-0 nova_compute[192698]: 2025-10-01 14:20:31.581 2 INFO nova.compute.manager [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Terminating instance
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.105 2 DEBUG nova.compute.manager [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:20:32 compute-0 kernel: tapa7d8619c-08 (unregistering): left promiscuous mode
Oct 01 14:20:32 compute-0 NetworkManager[51741]: <info>  [1759328432.1400] device (tapa7d8619c-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:20:32 compute-0 ovn_controller[94909]: 2025-10-01T14:20:32Z|00156|binding|INFO|Releasing lport a7d8619c-08fc-4631-ae5e-d12856c1a1e1 from this chassis (sb_readonly=0)
Oct 01 14:20:32 compute-0 ovn_controller[94909]: 2025-10-01T14:20:32Z|00157|binding|INFO|Setting lport a7d8619c-08fc-4631-ae5e-d12856c1a1e1 down in Southbound
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:32 compute-0 ovn_controller[94909]: 2025-10-01T14:20:32Z|00158|binding|INFO|Removing iface tapa7d8619c-08 ovn-installed in OVS
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.168 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:08:53:17 10.100.0.5'], port_security=['fa:16:3e:08:53:17 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '02793c05-e4d6-429f-827a-83af4ed29eaf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=a7d8619c-08fc-4631-ae5e-d12856c1a1e1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.170 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a7d8619c-08fc-4631-ae5e-d12856c1a1e1 in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.172 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.205 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a078a046-7bcf-4285-bd45-14c205594910]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:32 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000013.scope: Deactivated successfully.
Oct 01 14:20:32 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000013.scope: Consumed 15.051s CPU time.
Oct 01 14:20:32 compute-0 systemd-machined[152704]: Machine qemu-13-instance-00000013 terminated.
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.253 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[daf44a26-b4e8-4db1-a7e5-2767d6decd03]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.256 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[92fb1aba-579c-4f80-a0ef-8414d4c4e050]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.302 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a94a5524-6f9d-4651-b48c-1d0b8164bc53]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.336 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7c3344d8-bc74-4449-b7d8-d06697c546fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474204, 'reachable_time': 30003, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222809, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.362 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[121a4345-a744-4f46-89c7-57e8f5484749]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474219, 'tstamp': 474219}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222815, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474223, 'tstamp': 474223}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222815, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.364 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.379 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.379 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.380 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.380 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:20:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:32.382 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[bf435d5c-f063-4c63-8a90-0b0d0c76f0d3]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.398 2 INFO nova.virt.libvirt.driver [-] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Instance destroyed successfully.
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.399 2 DEBUG nova.objects.instance [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lazy-loading 'resources' on Instance uuid 02793c05-e4d6-429f-827a-83af4ed29eaf obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.424 2 DEBUG nova.compute.manager [req-64da93ed-76e2-459f-b12d-ef691f638e2e req-6046d6a4-3ee9-44a8-b047-8649ffb643db 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Received event network-vif-unplugged-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.424 2 DEBUG oslo_concurrency.lockutils [req-64da93ed-76e2-459f-b12d-ef691f638e2e req-6046d6a4-3ee9-44a8-b047-8649ffb643db 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.425 2 DEBUG oslo_concurrency.lockutils [req-64da93ed-76e2-459f-b12d-ef691f638e2e req-6046d6a4-3ee9-44a8-b047-8649ffb643db 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.425 2 DEBUG oslo_concurrency.lockutils [req-64da93ed-76e2-459f-b12d-ef691f638e2e req-6046d6a4-3ee9-44a8-b047-8649ffb643db 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.425 2 DEBUG nova.compute.manager [req-64da93ed-76e2-459f-b12d-ef691f638e2e req-6046d6a4-3ee9-44a8-b047-8649ffb643db 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] No waiting events found dispatching network-vif-unplugged-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.426 2 DEBUG nova.compute.manager [req-64da93ed-76e2-459f-b12d-ef691f638e2e req-6046d6a4-3ee9-44a8-b047-8649ffb643db 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Received event network-vif-unplugged-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.928 2 DEBUG nova.virt.libvirt.vif [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:19:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-462029158',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-462029158',id=19,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:19:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-n7j4s4mz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:19:43Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=02793c05-e4d6-429f-827a-83af4ed29eaf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "address": "fa:16:3e:08:53:17", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa7d8619c-08", "ovs_interfaceid": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.929 2 DEBUG nova.network.os_vif_util [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "address": "fa:16:3e:08:53:17", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa7d8619c-08", "ovs_interfaceid": "a7d8619c-08fc-4631-ae5e-d12856c1a1e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.930 2 DEBUG nova.network.os_vif_util [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:08:53:17,bridge_name='br-int',has_traffic_filtering=True,id=a7d8619c-08fc-4631-ae5e-d12856c1a1e1,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa7d8619c-08') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.931 2 DEBUG os_vif [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:53:17,bridge_name='br-int',has_traffic_filtering=True,id=a7d8619c-08fc-4631-ae5e-d12856c1a1e1,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa7d8619c-08') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.933 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa7d8619c-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.937 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=84dc1b24-55df-4824-8291-77560c76a3fe) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.941 2 INFO os_vif [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:08:53:17,bridge_name='br-int',has_traffic_filtering=True,id=a7d8619c-08fc-4631-ae5e-d12856c1a1e1,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa7d8619c-08')
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.942 2 INFO nova.virt.libvirt.driver [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Deleting instance files /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf_del
Oct 01 14:20:32 compute-0 nova_compute[192698]: 2025-10-01 14:20:32.943 2 INFO nova.virt.libvirt.driver [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Deletion of /var/lib/nova/instances/02793c05-e4d6-429f-827a-83af4ed29eaf_del complete
Oct 01 14:20:33 compute-0 nova_compute[192698]: 2025-10-01 14:20:33.458 2 INFO nova.compute.manager [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Took 1.35 seconds to destroy the instance on the hypervisor.
Oct 01 14:20:33 compute-0 nova_compute[192698]: 2025-10-01 14:20:33.458 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:20:33 compute-0 nova_compute[192698]: 2025-10-01 14:20:33.459 2 DEBUG nova.compute.manager [-] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:20:33 compute-0 nova_compute[192698]: 2025-10-01 14:20:33.459 2 DEBUG nova.network.neutron [-] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:20:33 compute-0 nova_compute[192698]: 2025-10-01 14:20:33.460 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:33 compute-0 nova_compute[192698]: 2025-10-01 14:20:33.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:34 compute-0 nova_compute[192698]: 2025-10-01 14:20:34.313 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:34 compute-0 nova_compute[192698]: 2025-10-01 14:20:34.482 2 DEBUG nova.compute.manager [req-6e0328a8-8e3b-44bc-91cc-70b7cf6fcf1d req-5dbbd62d-8b56-4f3e-b47a-4ca2ffdc242c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Received event network-vif-unplugged-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:20:34 compute-0 nova_compute[192698]: 2025-10-01 14:20:34.483 2 DEBUG oslo_concurrency.lockutils [req-6e0328a8-8e3b-44bc-91cc-70b7cf6fcf1d req-5dbbd62d-8b56-4f3e-b47a-4ca2ffdc242c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:34 compute-0 nova_compute[192698]: 2025-10-01 14:20:34.483 2 DEBUG oslo_concurrency.lockutils [req-6e0328a8-8e3b-44bc-91cc-70b7cf6fcf1d req-5dbbd62d-8b56-4f3e-b47a-4ca2ffdc242c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:34 compute-0 nova_compute[192698]: 2025-10-01 14:20:34.483 2 DEBUG oslo_concurrency.lockutils [req-6e0328a8-8e3b-44bc-91cc-70b7cf6fcf1d req-5dbbd62d-8b56-4f3e-b47a-4ca2ffdc242c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:34 compute-0 nova_compute[192698]: 2025-10-01 14:20:34.483 2 DEBUG nova.compute.manager [req-6e0328a8-8e3b-44bc-91cc-70b7cf6fcf1d req-5dbbd62d-8b56-4f3e-b47a-4ca2ffdc242c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] No waiting events found dispatching network-vif-unplugged-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:20:34 compute-0 nova_compute[192698]: 2025-10-01 14:20:34.483 2 DEBUG nova.compute.manager [req-6e0328a8-8e3b-44bc-91cc-70b7cf6fcf1d req-5dbbd62d-8b56-4f3e-b47a-4ca2ffdc242c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Received event network-vif-unplugged-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:20:34 compute-0 nova_compute[192698]: 2025-10-01 14:20:34.629 2 DEBUG nova.compute.manager [req-d8c37a53-4dfd-4ed1-9211-620f5cdfe1b5 req-a53d7eb5-30ea-49cf-9380-8897a8b20d7b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Received event network-vif-deleted-a7d8619c-08fc-4631-ae5e-d12856c1a1e1 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:20:34 compute-0 nova_compute[192698]: 2025-10-01 14:20:34.629 2 INFO nova.compute.manager [req-d8c37a53-4dfd-4ed1-9211-620f5cdfe1b5 req-a53d7eb5-30ea-49cf-9380-8897a8b20d7b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Neutron deleted interface a7d8619c-08fc-4631-ae5e-d12856c1a1e1; detaching it from the instance and deleting it from the info cache
Oct 01 14:20:34 compute-0 nova_compute[192698]: 2025-10-01 14:20:34.630 2 DEBUG nova.network.neutron [req-d8c37a53-4dfd-4ed1-9211-620f5cdfe1b5 req-a53d7eb5-30ea-49cf-9380-8897a8b20d7b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:20:35 compute-0 nova_compute[192698]: 2025-10-01 14:20:35.067 2 DEBUG nova.network.neutron [-] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:20:35 compute-0 nova_compute[192698]: 2025-10-01 14:20:35.136 2 DEBUG nova.compute.manager [req-d8c37a53-4dfd-4ed1-9211-620f5cdfe1b5 req-a53d7eb5-30ea-49cf-9380-8897a8b20d7b 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Detach interface failed, port_id=a7d8619c-08fc-4631-ae5e-d12856c1a1e1, reason: Instance 02793c05-e4d6-429f-827a-83af4ed29eaf could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:20:35 compute-0 nova_compute[192698]: 2025-10-01 14:20:35.573 2 INFO nova.compute.manager [-] [instance: 02793c05-e4d6-429f-827a-83af4ed29eaf] Took 2.11 seconds to deallocate network for instance.
Oct 01 14:20:36 compute-0 nova_compute[192698]: 2025-10-01 14:20:36.109 2 DEBUG oslo_concurrency.lockutils [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:36 compute-0 nova_compute[192698]: 2025-10-01 14:20:36.109 2 DEBUG oslo_concurrency.lockutils [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:36 compute-0 nova_compute[192698]: 2025-10-01 14:20:36.179 2 DEBUG nova.compute.provider_tree [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:20:36 compute-0 nova_compute[192698]: 2025-10-01 14:20:36.690 2 DEBUG nova.scheduler.client.report [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:20:37 compute-0 nova_compute[192698]: 2025-10-01 14:20:37.199 2 DEBUG oslo_concurrency.lockutils [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:37 compute-0 nova_compute[192698]: 2025-10-01 14:20:37.227 2 INFO nova.scheduler.client.report [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Deleted allocations for instance 02793c05-e4d6-429f-827a-83af4ed29eaf
Oct 01 14:20:37 compute-0 nova_compute[192698]: 2025-10-01 14:20:37.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:38 compute-0 nova_compute[192698]: 2025-10-01 14:20:38.265 2 DEBUG oslo_concurrency.lockutils [None req-adcd09c2-88d6-4b38-97f0-bcefadc6a644 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "02793c05-e4d6-429f-827a-83af4ed29eaf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.703s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:38 compute-0 nova_compute[192698]: 2025-10-01 14:20:38.925 2 DEBUG oslo_concurrency.lockutils [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "64f0e219-6df7-4a26-b95b-90f93f33620e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:38 compute-0 nova_compute[192698]: 2025-10-01 14:20:38.926 2 DEBUG oslo_concurrency.lockutils [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "64f0e219-6df7-4a26-b95b-90f93f33620e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:38 compute-0 nova_compute[192698]: 2025-10-01 14:20:38.926 2 DEBUG oslo_concurrency.lockutils [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "64f0e219-6df7-4a26-b95b-90f93f33620e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:38 compute-0 nova_compute[192698]: 2025-10-01 14:20:38.926 2 DEBUG oslo_concurrency.lockutils [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "64f0e219-6df7-4a26-b95b-90f93f33620e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:38 compute-0 nova_compute[192698]: 2025-10-01 14:20:38.926 2 DEBUG oslo_concurrency.lockutils [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "64f0e219-6df7-4a26-b95b-90f93f33620e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:38 compute-0 nova_compute[192698]: 2025-10-01 14:20:38.937 2 INFO nova.compute.manager [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Terminating instance
Oct 01 14:20:38 compute-0 nova_compute[192698]: 2025-10-01 14:20:38.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:39 compute-0 nova_compute[192698]: 2025-10-01 14:20:39.451 2 DEBUG nova.compute.manager [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:20:39 compute-0 kernel: tap371e2b01-c6 (unregistering): left promiscuous mode
Oct 01 14:20:39 compute-0 NetworkManager[51741]: <info>  [1759328439.4835] device (tap371e2b01-c6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:20:39 compute-0 nova_compute[192698]: 2025-10-01 14:20:39.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:39 compute-0 ovn_controller[94909]: 2025-10-01T14:20:39Z|00159|binding|INFO|Releasing lport 371e2b01-c6b3-4eb6-ae51-19962f1315ef from this chassis (sb_readonly=0)
Oct 01 14:20:39 compute-0 ovn_controller[94909]: 2025-10-01T14:20:39Z|00160|binding|INFO|Setting lport 371e2b01-c6b3-4eb6-ae51-19962f1315ef down in Southbound
Oct 01 14:20:39 compute-0 ovn_controller[94909]: 2025-10-01T14:20:39Z|00161|binding|INFO|Removing iface tap371e2b01-c6 ovn-installed in OVS
Oct 01 14:20:39 compute-0 nova_compute[192698]: 2025-10-01 14:20:39.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.506 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:4f:08 10.100.0.4'], port_security=['fa:16:3e:5f:4f:08 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '64f0e219-6df7-4a26-b95b-90f93f33620e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '15', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=371e2b01-c6b3-4eb6-ae51-19962f1315ef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.507 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 371e2b01-c6b3-4eb6-ae51-19962f1315ef in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.508 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 031a8987-8430-4fb6-a464-01e4dca2fae7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.510 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[b5e75ed8-4c80-4327-bf63-d7365c88deab]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.511 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 namespace which is not needed anymore
Oct 01 14:20:39 compute-0 nova_compute[192698]: 2025-10-01 14:20:39.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:39 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000012.scope: Deactivated successfully.
Oct 01 14:20:39 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000012.scope: Consumed 3.168s CPU time.
Oct 01 14:20:39 compute-0 systemd-machined[152704]: Machine qemu-14-instance-00000012 terminated.
Oct 01 14:20:39 compute-0 podman[222854]: 2025-10-01 14:20:39.643401832 +0000 UTC m=+0.035027656 container kill 5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:20:39 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[222444]: [NOTICE]   (222448) : haproxy version is 3.0.5-8e879a5
Oct 01 14:20:39 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[222444]: [NOTICE]   (222448) : path to executable is /usr/sbin/haproxy
Oct 01 14:20:39 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[222444]: [WARNING]  (222448) : Exiting Master process...
Oct 01 14:20:39 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[222444]: [ALERT]    (222448) : Current worker (222450) exited with code 143 (Terminated)
Oct 01 14:20:39 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[222444]: [WARNING]  (222448) : All workers exited. Exiting... (0)
Oct 01 14:20:39 compute-0 systemd[1]: libpod-5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8.scope: Deactivated successfully.
Oct 01 14:20:39 compute-0 podman[222871]: 2025-10-01 14:20:39.707063989 +0000 UTC m=+0.033743761 container died 5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 01 14:20:39 compute-0 nova_compute[192698]: 2025-10-01 14:20:39.730 2 INFO nova.virt.libvirt.driver [-] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Instance destroyed successfully.
Oct 01 14:20:39 compute-0 nova_compute[192698]: 2025-10-01 14:20:39.730 2 DEBUG nova.objects.instance [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lazy-loading 'resources' on Instance uuid 64f0e219-6df7-4a26-b95b-90f93f33620e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:20:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8-userdata-shm.mount: Deactivated successfully.
Oct 01 14:20:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-e36be47be6c5d6e98532476ff23ba4a4cd1efed9460afc15aacbc23c4b99fba5-merged.mount: Deactivated successfully.
Oct 01 14:20:39 compute-0 podman[222871]: 2025-10-01 14:20:39.771494217 +0000 UTC m=+0.098173969 container remove 5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.777 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6ffcec-3295-4416-a5d3-0cbbca2d19b5]: (4, ("Wed Oct  1 02:20:39 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 (5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8)\n5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8\nWed Oct  1 02:20:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 (5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8)\n5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.779 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb2b350-9826-44a9-86b0-4b08e275574d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.779 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:20:39 compute-0 systemd[1]: libpod-conmon-5e6ff6386b445c145f4fa09fd78b67ccbea946bf1856fdca0c305e737a2d24e8.scope: Deactivated successfully.
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.780 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f26d8a6e-0dc1-4b1a-98b3-b0ac031fb877]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.781 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:39 compute-0 nova_compute[192698]: 2025-10-01 14:20:39.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:39 compute-0 kernel: tap031a8987-80: left promiscuous mode
Oct 01 14:20:39 compute-0 nova_compute[192698]: 2025-10-01 14:20:39.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:39 compute-0 nova_compute[192698]: 2025-10-01 14:20:39.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.806 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[74baf991-cdae-4e35-8988-ca4d03cf23f4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.846 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[6328990c-14c0-4637-836e-09b017309d7c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.847 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[023ccbef-4cd0-458f-adf7-226a88b8006c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.876 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad23c81-ed74-4700-8d96-855407bf5e08]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474192, 'reachable_time': 41675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222921, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:39 compute-0 systemd[1]: run-netns-ovnmeta\x2d031a8987\x2d8430\x2d4fb6\x2da464\x2d01e4dca2fae7.mount: Deactivated successfully.
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.884 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:20:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:20:39.885 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[e68d851d-2e32-41e0-8a28-a8317e7e52fb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.239 2 DEBUG nova.virt.libvirt.vif [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-01T14:19:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2078744306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2078744306',id=18,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:19:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-ryv8hzlj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',clean_attempts='1',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:20:22Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=64f0e219-6df7-4a26-b95b-90f93f33620e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "371e2b01-c6b3-4eb6-ae51-19962f1315ef", "address": "fa:16:3e:5f:4f:08", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap371e2b01-c6", "ovs_interfaceid": "371e2b01-c6b3-4eb6-ae51-19962f1315ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.240 2 DEBUG nova.network.os_vif_util [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "371e2b01-c6b3-4eb6-ae51-19962f1315ef", "address": "fa:16:3e:5f:4f:08", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap371e2b01-c6", "ovs_interfaceid": "371e2b01-c6b3-4eb6-ae51-19962f1315ef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.241 2 DEBUG nova.network.os_vif_util [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:4f:08,bridge_name='br-int',has_traffic_filtering=True,id=371e2b01-c6b3-4eb6-ae51-19962f1315ef,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap371e2b01-c6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.242 2 DEBUG os_vif [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:4f:08,bridge_name='br-int',has_traffic_filtering=True,id=371e2b01-c6b3-4eb6-ae51-19962f1315ef,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap371e2b01-c6') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.245 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap371e2b01-c6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.251 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b9cf8d9e-b27d-42be-b010-c5aa6241ef84) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.256 2 INFO os_vif [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:4f:08,bridge_name='br-int',has_traffic_filtering=True,id=371e2b01-c6b3-4eb6-ae51-19962f1315ef,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap371e2b01-c6')
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.257 2 INFO nova.virt.libvirt.driver [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Deleting instance files /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e_del
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.258 2 INFO nova.virt.libvirt.driver [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Deletion of /var/lib/nova/instances/64f0e219-6df7-4a26-b95b-90f93f33620e_del complete
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.408 2 DEBUG nova.compute.manager [req-c803c4ae-c3df-4911-97a6-1d0d9caa5254 req-77d40695-3c95-4af7-82b4-0d3bfadb7197 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Received event network-vif-unplugged-371e2b01-c6b3-4eb6-ae51-19962f1315ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.409 2 DEBUG oslo_concurrency.lockutils [req-c803c4ae-c3df-4911-97a6-1d0d9caa5254 req-77d40695-3c95-4af7-82b4-0d3bfadb7197 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "64f0e219-6df7-4a26-b95b-90f93f33620e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.410 2 DEBUG oslo_concurrency.lockutils [req-c803c4ae-c3df-4911-97a6-1d0d9caa5254 req-77d40695-3c95-4af7-82b4-0d3bfadb7197 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "64f0e219-6df7-4a26-b95b-90f93f33620e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.410 2 DEBUG oslo_concurrency.lockutils [req-c803c4ae-c3df-4911-97a6-1d0d9caa5254 req-77d40695-3c95-4af7-82b4-0d3bfadb7197 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "64f0e219-6df7-4a26-b95b-90f93f33620e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.411 2 DEBUG nova.compute.manager [req-c803c4ae-c3df-4911-97a6-1d0d9caa5254 req-77d40695-3c95-4af7-82b4-0d3bfadb7197 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] No waiting events found dispatching network-vif-unplugged-371e2b01-c6b3-4eb6-ae51-19962f1315ef pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.411 2 DEBUG nova.compute.manager [req-c803c4ae-c3df-4911-97a6-1d0d9caa5254 req-77d40695-3c95-4af7-82b4-0d3bfadb7197 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Received event network-vif-unplugged-371e2b01-c6b3-4eb6-ae51-19962f1315ef for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.775 2 INFO nova.compute.manager [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.776 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.776 2 DEBUG nova.compute.manager [-] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.777 2 DEBUG nova.network.neutron [-] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.777 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:40 compute-0 nova_compute[192698]: 2025-10-01 14:20:40.856 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:20:41 compute-0 nova_compute[192698]: 2025-10-01 14:20:41.361 2 DEBUG nova.compute.manager [req-6c78be42-6456-42d1-b21e-0200361ca5ec req-877df59f-e54e-4a87-bc65-372c3f4466b0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Received event network-vif-deleted-371e2b01-c6b3-4eb6-ae51-19962f1315ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:20:41 compute-0 nova_compute[192698]: 2025-10-01 14:20:41.362 2 INFO nova.compute.manager [req-6c78be42-6456-42d1-b21e-0200361ca5ec req-877df59f-e54e-4a87-bc65-372c3f4466b0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Neutron deleted interface 371e2b01-c6b3-4eb6-ae51-19962f1315ef; detaching it from the instance and deleting it from the info cache
Oct 01 14:20:41 compute-0 nova_compute[192698]: 2025-10-01 14:20:41.362 2 DEBUG nova.network.neutron [req-6c78be42-6456-42d1-b21e-0200361ca5ec req-877df59f-e54e-4a87-bc65-372c3f4466b0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:20:41 compute-0 nova_compute[192698]: 2025-10-01 14:20:41.654 2 DEBUG nova.network.neutron [-] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:20:41 compute-0 nova_compute[192698]: 2025-10-01 14:20:41.871 2 DEBUG nova.compute.manager [req-6c78be42-6456-42d1-b21e-0200361ca5ec req-877df59f-e54e-4a87-bc65-372c3f4466b0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Detach interface failed, port_id=371e2b01-c6b3-4eb6-ae51-19962f1315ef, reason: Instance 64f0e219-6df7-4a26-b95b-90f93f33620e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:20:42 compute-0 nova_compute[192698]: 2025-10-01 14:20:42.162 2 INFO nova.compute.manager [-] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Took 1.39 seconds to deallocate network for instance.
Oct 01 14:20:42 compute-0 podman[222923]: 2025-10-01 14:20:42.197973468 +0000 UTC m=+0.096902664 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 01 14:20:42 compute-0 podman[222924]: 2025-10-01 14:20:42.22288304 +0000 UTC m=+0.124930790 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 01 14:20:42 compute-0 nova_compute[192698]: 2025-10-01 14:20:42.471 2 DEBUG nova.compute.manager [req-ebb0c71c-088a-4633-9db8-4d40611a8db9 req-cea2c0b3-856b-483f-bd4d-6cba03acbe72 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Received event network-vif-unplugged-371e2b01-c6b3-4eb6-ae51-19962f1315ef external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:20:42 compute-0 nova_compute[192698]: 2025-10-01 14:20:42.471 2 DEBUG oslo_concurrency.lockutils [req-ebb0c71c-088a-4633-9db8-4d40611a8db9 req-cea2c0b3-856b-483f-bd4d-6cba03acbe72 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "64f0e219-6df7-4a26-b95b-90f93f33620e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:42 compute-0 nova_compute[192698]: 2025-10-01 14:20:42.472 2 DEBUG oslo_concurrency.lockutils [req-ebb0c71c-088a-4633-9db8-4d40611a8db9 req-cea2c0b3-856b-483f-bd4d-6cba03acbe72 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "64f0e219-6df7-4a26-b95b-90f93f33620e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:42 compute-0 nova_compute[192698]: 2025-10-01 14:20:42.472 2 DEBUG oslo_concurrency.lockutils [req-ebb0c71c-088a-4633-9db8-4d40611a8db9 req-cea2c0b3-856b-483f-bd4d-6cba03acbe72 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "64f0e219-6df7-4a26-b95b-90f93f33620e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:42 compute-0 nova_compute[192698]: 2025-10-01 14:20:42.473 2 DEBUG nova.compute.manager [req-ebb0c71c-088a-4633-9db8-4d40611a8db9 req-cea2c0b3-856b-483f-bd4d-6cba03acbe72 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] No waiting events found dispatching network-vif-unplugged-371e2b01-c6b3-4eb6-ae51-19962f1315ef pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:20:42 compute-0 nova_compute[192698]: 2025-10-01 14:20:42.473 2 WARNING nova.compute.manager [req-ebb0c71c-088a-4633-9db8-4d40611a8db9 req-cea2c0b3-856b-483f-bd4d-6cba03acbe72 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 64f0e219-6df7-4a26-b95b-90f93f33620e] Received unexpected event network-vif-unplugged-371e2b01-c6b3-4eb6-ae51-19962f1315ef for instance with vm_state deleted and task_state None.
Oct 01 14:20:42 compute-0 nova_compute[192698]: 2025-10-01 14:20:42.689 2 DEBUG oslo_concurrency.lockutils [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:20:42 compute-0 nova_compute[192698]: 2025-10-01 14:20:42.689 2 DEBUG oslo_concurrency.lockutils [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:20:42 compute-0 nova_compute[192698]: 2025-10-01 14:20:42.695 2 DEBUG oslo_concurrency.lockutils [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:42 compute-0 nova_compute[192698]: 2025-10-01 14:20:42.720 2 INFO nova.scheduler.client.report [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Deleted allocations for instance 64f0e219-6df7-4a26-b95b-90f93f33620e
Oct 01 14:20:43 compute-0 nova_compute[192698]: 2025-10-01 14:20:43.754 2 DEBUG oslo_concurrency.lockutils [None req-673f88ac-92e1-46f2-9d4f-e7f49d03a658 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "64f0e219-6df7-4a26-b95b-90f93f33620e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.828s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:20:43 compute-0 nova_compute[192698]: 2025-10-01 14:20:43.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:45 compute-0 nova_compute[192698]: 2025-10-01 14:20:45.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:48 compute-0 nova_compute[192698]: 2025-10-01 14:20:48.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:49 compute-0 podman[222968]: 2025-10-01 14:20:49.184644035 +0000 UTC m=+0.095933748 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 01 14:20:50 compute-0 nova_compute[192698]: 2025-10-01 14:20:50.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:53 compute-0 nova_compute[192698]: 2025-10-01 14:20:53.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:55 compute-0 podman[222990]: 2025-10-01 14:20:55.177441916 +0000 UTC m=+0.086653338 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 01 14:20:55 compute-0 podman[222989]: 2025-10-01 14:20:55.175240656 +0000 UTC m=+0.088994531 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct 01 14:20:55 compute-0 nova_compute[192698]: 2025-10-01 14:20:55.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:58 compute-0 nova_compute[192698]: 2025-10-01 14:20:58.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:20:59 compute-0 podman[203144]: time="2025-10-01T14:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:20:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:20:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3020 "" "Go-http-client/1.1"
Oct 01 14:21:00 compute-0 nova_compute[192698]: 2025-10-01 14:21:00.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:01 compute-0 podman[223028]: 2025-10-01 14:21:01.141098902 +0000 UTC m=+0.064454919 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:21:01 compute-0 openstack_network_exporter[205307]: ERROR   14:21:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:21:01 compute-0 openstack_network_exporter[205307]: ERROR   14:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:21:01 compute-0 openstack_network_exporter[205307]: ERROR   14:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:21:01 compute-0 openstack_network_exporter[205307]: ERROR   14:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:21:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:21:01 compute-0 openstack_network_exporter[205307]: ERROR   14:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:21:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:21:03 compute-0 nova_compute[192698]: 2025-10-01 14:21:03.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:05 compute-0 nova_compute[192698]: 2025-10-01 14:21:05.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:09 compute-0 nova_compute[192698]: 2025-10-01 14:21:08.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:10 compute-0 nova_compute[192698]: 2025-10-01 14:21:10.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:13 compute-0 podman[223053]: 2025-10-01 14:21:13.13895872 +0000 UTC m=+0.050207376 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:21:13 compute-0 podman[223054]: 2025-10-01 14:21:13.181185548 +0000 UTC m=+0.087188682 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20250930)
Oct 01 14:21:14 compute-0 nova_compute[192698]: 2025-10-01 14:21:14.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:14.276 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:21:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:14.276 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:21:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:14.276 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:21:15 compute-0 nova_compute[192698]: 2025-10-01 14:21:15.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:19 compute-0 nova_compute[192698]: 2025-10-01 14:21:19.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:19 compute-0 nova_compute[192698]: 2025-10-01 14:21:19.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:21:19 compute-0 nova_compute[192698]: 2025-10-01 14:21:19.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:21:20 compute-0 podman[223100]: 2025-10-01 14:21:20.179292873 +0000 UTC m=+0.086848873 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350)
Oct 01 14:21:20 compute-0 nova_compute[192698]: 2025-10-01 14:21:20.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:20 compute-0 nova_compute[192698]: 2025-10-01 14:21:20.440 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:21:20 compute-0 nova_compute[192698]: 2025-10-01 14:21:20.440 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:21:20 compute-0 nova_compute[192698]: 2025-10-01 14:21:20.441 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:21:20 compute-0 nova_compute[192698]: 2025-10-01 14:21:20.441 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:21:20 compute-0 nova_compute[192698]: 2025-10-01 14:21:20.656 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:21:20 compute-0 nova_compute[192698]: 2025-10-01 14:21:20.657 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:21:20 compute-0 nova_compute[192698]: 2025-10-01 14:21:20.685 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:21:20 compute-0 nova_compute[192698]: 2025-10-01 14:21:20.689 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5853MB free_disk=73.30294036865234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:21:20 compute-0 nova_compute[192698]: 2025-10-01 14:21:20.689 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:21:20 compute-0 nova_compute[192698]: 2025-10-01 14:21:20.690 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:21:21 compute-0 nova_compute[192698]: 2025-10-01 14:21:21.575 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "57466778-4bb3-4165-9a9f-bfca9f200d03" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:21:21 compute-0 nova_compute[192698]: 2025-10-01 14:21:21.576 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:21:22 compute-0 nova_compute[192698]: 2025-10-01 14:21:22.082 2 DEBUG nova.compute.manager [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 01 14:21:22 compute-0 nova_compute[192698]: 2025-10-01 14:21:22.256 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 57466778-4bb3-4165-9a9f-bfca9f200d03 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1797
Oct 01 14:21:22 compute-0 nova_compute[192698]: 2025-10-01 14:21:22.257 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:21:22 compute-0 nova_compute[192698]: 2025-10-01 14:21:22.257 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:21:20 up  1:20,  0 user,  load average: 0.15, 0.21, 0.31\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:21:22 compute-0 nova_compute[192698]: 2025-10-01 14:21:22.306 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:21:22 compute-0 nova_compute[192698]: 2025-10-01 14:21:22.629 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:21:22 compute-0 nova_compute[192698]: 2025-10-01 14:21:22.820 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:21:23 compute-0 nova_compute[192698]: 2025-10-01 14:21:23.332 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:21:23 compute-0 nova_compute[192698]: 2025-10-01 14:21:23.333 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.643s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:21:23 compute-0 nova_compute[192698]: 2025-10-01 14:21:23.333 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.704s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:21:23 compute-0 nova_compute[192698]: 2025-10-01 14:21:23.343 2 DEBUG nova.virt.hardware [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:21:23 compute-0 nova_compute[192698]: 2025-10-01 14:21:23.343 2 INFO nova.compute.claims [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:21:24 compute-0 nova_compute[192698]: 2025-10-01 14:21:24.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:24 compute-0 nova_compute[192698]: 2025-10-01 14:21:24.401 2 DEBUG nova.compute.provider_tree [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:21:24 compute-0 nova_compute[192698]: 2025-10-01 14:21:24.910 2 DEBUG nova.scheduler.client.report [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:21:25 compute-0 nova_compute[192698]: 2025-10-01 14:21:25.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:25 compute-0 nova_compute[192698]: 2025-10-01 14:21:25.323 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:21:25 compute-0 nova_compute[192698]: 2025-10-01 14:21:25.452 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.119s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:21:25 compute-0 nova_compute[192698]: 2025-10-01 14:21:25.453 2 DEBUG nova.compute.manager [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 01 14:21:25 compute-0 nova_compute[192698]: 2025-10-01 14:21:25.832 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:21:25 compute-0 nova_compute[192698]: 2025-10-01 14:21:25.833 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:21:25 compute-0 nova_compute[192698]: 2025-10-01 14:21:25.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:21:25 compute-0 nova_compute[192698]: 2025-10-01 14:21:25.967 2 DEBUG nova.compute.manager [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 01 14:21:25 compute-0 nova_compute[192698]: 2025-10-01 14:21:25.968 2 DEBUG nova.network.neutron [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 01 14:21:25 compute-0 nova_compute[192698]: 2025-10-01 14:21:25.968 2 WARNING neutronclient.v2_0.client [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:21:25 compute-0 nova_compute[192698]: 2025-10-01 14:21:25.969 2 WARNING neutronclient.v2_0.client [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:21:26 compute-0 podman[223123]: 2025-10-01 14:21:26.180360477 +0000 UTC m=+0.089561806 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 01 14:21:26 compute-0 podman[223124]: 2025-10-01 14:21:26.217365165 +0000 UTC m=+0.116423750 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, config_id=multipathd)
Oct 01 14:21:26 compute-0 nova_compute[192698]: 2025-10-01 14:21:26.479 2 INFO nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 01 14:21:26 compute-0 nova_compute[192698]: 2025-10-01 14:21:26.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:21:26 compute-0 nova_compute[192698]: 2025-10-01 14:21:26.989 2 DEBUG nova.compute.manager [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 01 14:21:27 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:27.396 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:21:27 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:27.397 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:21:27 compute-0 nova_compute[192698]: 2025-10-01 14:21:27.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.014 2 DEBUG nova.compute.manager [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.016 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.016 2 INFO nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Creating image(s)
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.017 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "/var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.018 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "/var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.019 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "/var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.020 2 DEBUG oslo_utils.imageutils.format_inspector [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.026 2 DEBUG oslo_utils.imageutils.format_inspector [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.028 2 DEBUG oslo_concurrency.processutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.089 2 DEBUG oslo_concurrency.processutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.090 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.091 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.092 2 DEBUG oslo_utils.imageutils.format_inspector [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.098 2 DEBUG oslo_utils.imageutils.format_inspector [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.099 2 DEBUG oslo_concurrency.processutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.164 2 DEBUG oslo_concurrency.processutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.166 2 DEBUG oslo_concurrency.processutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.214 2 DEBUG oslo_concurrency.processutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.216 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.217 2 DEBUG oslo_concurrency.processutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.292 2 DEBUG oslo_concurrency.processutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.294 2 DEBUG nova.virt.disk.api [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Checking if we can resize image /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.294 2 DEBUG oslo_concurrency.processutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.384 2 DEBUG oslo_concurrency.processutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.386 2 DEBUG nova.virt.disk.api [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Cannot resize image /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.387 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.387 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Ensure instance console log exists: /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.388 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.389 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:21:28 compute-0 nova_compute[192698]: 2025-10-01 14:21:28.389 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:21:29 compute-0 nova_compute[192698]: 2025-10-01 14:21:29.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:29 compute-0 nova_compute[192698]: 2025-10-01 14:21:29.356 2 DEBUG nova.network.neutron [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Successfully created port: f7e77fe8-6f1f-498f-be00-420b8e788b70 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 01 14:21:29 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:29.399 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:21:29 compute-0 podman[203144]: time="2025-10-01T14:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:21:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:21:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3025 "" "Go-http-client/1.1"
Oct 01 14:21:30 compute-0 nova_compute[192698]: 2025-10-01 14:21:30.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:30 compute-0 nova_compute[192698]: 2025-10-01 14:21:30.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:21:31 compute-0 nova_compute[192698]: 2025-10-01 14:21:31.068 2 DEBUG nova.network.neutron [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Successfully updated port: f7e77fe8-6f1f-498f-be00-420b8e788b70 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 01 14:21:31 compute-0 nova_compute[192698]: 2025-10-01 14:21:31.125 2 DEBUG nova.compute.manager [req-67f01867-dc7b-4822-b550-d90e8d0a90f6 req-ab81c27b-7ce3-4b0c-9edb-ce6d73f62bc5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Received event network-changed-f7e77fe8-6f1f-498f-be00-420b8e788b70 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:21:31 compute-0 nova_compute[192698]: 2025-10-01 14:21:31.126 2 DEBUG nova.compute.manager [req-67f01867-dc7b-4822-b550-d90e8d0a90f6 req-ab81c27b-7ce3-4b0c-9edb-ce6d73f62bc5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Refreshing instance network info cache due to event network-changed-f7e77fe8-6f1f-498f-be00-420b8e788b70. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:21:31 compute-0 nova_compute[192698]: 2025-10-01 14:21:31.126 2 DEBUG oslo_concurrency.lockutils [req-67f01867-dc7b-4822-b550-d90e8d0a90f6 req-ab81c27b-7ce3-4b0c-9edb-ce6d73f62bc5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-57466778-4bb3-4165-9a9f-bfca9f200d03" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:21:31 compute-0 nova_compute[192698]: 2025-10-01 14:21:31.127 2 DEBUG oslo_concurrency.lockutils [req-67f01867-dc7b-4822-b550-d90e8d0a90f6 req-ab81c27b-7ce3-4b0c-9edb-ce6d73f62bc5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-57466778-4bb3-4165-9a9f-bfca9f200d03" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:21:31 compute-0 nova_compute[192698]: 2025-10-01 14:21:31.127 2 DEBUG nova.network.neutron [req-67f01867-dc7b-4822-b550-d90e8d0a90f6 req-ab81c27b-7ce3-4b0c-9edb-ce6d73f62bc5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Refreshing network info cache for port f7e77fe8-6f1f-498f-be00-420b8e788b70 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:21:31 compute-0 openstack_network_exporter[205307]: ERROR   14:21:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:21:31 compute-0 openstack_network_exporter[205307]: ERROR   14:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:21:31 compute-0 openstack_network_exporter[205307]: ERROR   14:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:21:31 compute-0 openstack_network_exporter[205307]: ERROR   14:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:21:31 compute-0 openstack_network_exporter[205307]: ERROR   14:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:21:31 compute-0 nova_compute[192698]: 2025-10-01 14:21:31.575 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "refresh_cache-57466778-4bb3-4165-9a9f-bfca9f200d03" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:21:31 compute-0 nova_compute[192698]: 2025-10-01 14:21:31.634 2 WARNING neutronclient.v2_0.client [req-67f01867-dc7b-4822-b550-d90e8d0a90f6 req-ab81c27b-7ce3-4b0c-9edb-ce6d73f62bc5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:21:31 compute-0 nova_compute[192698]: 2025-10-01 14:21:31.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:21:31 compute-0 nova_compute[192698]: 2025-10-01 14:21:31.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:21:32 compute-0 podman[223179]: 2025-10-01 14:21:32.164097956 +0000 UTC m=+0.073847052 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:21:32 compute-0 nova_compute[192698]: 2025-10-01 14:21:32.361 2 DEBUG nova.network.neutron [req-67f01867-dc7b-4822-b550-d90e8d0a90f6 req-ab81c27b-7ce3-4b0c-9edb-ce6d73f62bc5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:21:32 compute-0 nova_compute[192698]: 2025-10-01 14:21:32.537 2 DEBUG nova.network.neutron [req-67f01867-dc7b-4822-b550-d90e8d0a90f6 req-ab81c27b-7ce3-4b0c-9edb-ce6d73f62bc5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:21:33 compute-0 nova_compute[192698]: 2025-10-01 14:21:33.043 2 DEBUG oslo_concurrency.lockutils [req-67f01867-dc7b-4822-b550-d90e8d0a90f6 req-ab81c27b-7ce3-4b0c-9edb-ce6d73f62bc5 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-57466778-4bb3-4165-9a9f-bfca9f200d03" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:21:33 compute-0 nova_compute[192698]: 2025-10-01 14:21:33.045 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquired lock "refresh_cache-57466778-4bb3-4165-9a9f-bfca9f200d03" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:21:33 compute-0 nova_compute[192698]: 2025-10-01 14:21:33.045 2 DEBUG nova.network.neutron [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:21:34 compute-0 nova_compute[192698]: 2025-10-01 14:21:34.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:34 compute-0 nova_compute[192698]: 2025-10-01 14:21:34.358 2 DEBUG nova.network.neutron [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.253 2 WARNING neutronclient.v2_0.client [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.429 2 DEBUG nova.network.neutron [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Updating instance_info_cache with network_info: [{"id": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "address": "fa:16:3e:83:bd:d6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e77fe8-6f", "ovs_interfaceid": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.936 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Releasing lock "refresh_cache-57466778-4bb3-4165-9a9f-bfca9f200d03" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.937 2 DEBUG nova.compute.manager [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Instance network_info: |[{"id": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "address": "fa:16:3e:83:bd:d6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e77fe8-6f", "ovs_interfaceid": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.941 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Start _get_guest_xml network_info=[{"id": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "address": "fa:16:3e:83:bd:d6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e77fe8-6f", "ovs_interfaceid": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.946 2 WARNING nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.947 2 DEBUG nova.virt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-678066720', uuid='57466778-4bb3-4165-9a9f-bfca9f200d03'), owner=OwnerMeta(userid='f8897741e6ca4770b56d28d05fa3fc42', username='tempest-TestExecuteStrategies-30131345-project-admin', projectid='d43115e3729442e1b68b749acc0dabc8', projectname='tempest-TestExecuteStrategies-30131345'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "address": "fa:16:3e:83:bd:d6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e77fe8-6f", "ovs_interfaceid": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "qbh_params": null, "qbg_params": 
null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759328495.947512) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.951 2 DEBUG nova.virt.libvirt.host [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.951 2 DEBUG nova.virt.libvirt.host [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.955 2 DEBUG nova.virt.libvirt.host [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.955 2 DEBUG nova.virt.libvirt.host [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.956 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.956 2 DEBUG nova.virt.hardware [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.957 2 DEBUG nova.virt.hardware [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.957 2 DEBUG nova.virt.hardware [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.957 2 DEBUG nova.virt.hardware [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.957 2 DEBUG nova.virt.hardware [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.958 2 DEBUG nova.virt.hardware [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.958 2 DEBUG nova.virt.hardware [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.958 2 DEBUG nova.virt.hardware [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.959 2 DEBUG nova.virt.hardware [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.959 2 DEBUG nova.virt.hardware [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.959 2 DEBUG nova.virt.hardware [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.964 2 DEBUG nova.virt.libvirt.vif [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-678066720',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-678066720',id=21,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-fih772z7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},ta
gs=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:21:27Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=57466778-4bb3-4165-9a9f-bfca9f200d03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "address": "fa:16:3e:83:bd:d6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e77fe8-6f", "ovs_interfaceid": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.964 2 DEBUG nova.network.os_vif_util [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "address": "fa:16:3e:83:bd:d6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e77fe8-6f", "ovs_interfaceid": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.965 2 DEBUG nova.network.os_vif_util [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:bd:d6,bridge_name='br-int',has_traffic_filtering=True,id=f7e77fe8-6f1f-498f-be00-420b8e788b70,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e77fe8-6f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:21:35 compute-0 nova_compute[192698]: 2025-10-01 14:21:35.966 2 DEBUG nova.objects.instance [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 57466778-4bb3-4165-9a9f-bfca9f200d03 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.477 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:21:36 compute-0 nova_compute[192698]:   <uuid>57466778-4bb3-4165-9a9f-bfca9f200d03</uuid>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   <name>instance-00000015</name>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteStrategies-server-678066720</nova:name>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:21:35</nova:creationTime>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:21:36 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:21:36 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:user uuid="f8897741e6ca4770b56d28d05fa3fc42">tempest-TestExecuteStrategies-30131345-project-admin</nova:user>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:project uuid="d43115e3729442e1b68b749acc0dabc8">tempest-TestExecuteStrategies-30131345</nova:project>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         <nova:port uuid="f7e77fe8-6f1f-498f-be00-420b8e788b70">
Oct 01 14:21:36 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <system>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <entry name="serial">57466778-4bb3-4165-9a9f-bfca9f200d03</entry>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <entry name="uuid">57466778-4bb3-4165-9a9f-bfca9f200d03</entry>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     </system>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   <os>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   </os>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   <features>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   </features>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk.config"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:83:bd:d6"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <target dev="tapf7e77fe8-6f"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/console.log" append="off"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <video>
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     </video>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:21:36 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:21:36 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:21:36 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:21:36 compute-0 nova_compute[192698]: </domain>
Oct 01 14:21:36 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.478 2 DEBUG nova.compute.manager [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Preparing to wait for external event network-vif-plugged-f7e77fe8-6f1f-498f-be00-420b8e788b70 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.479 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.479 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.480 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.481 2 DEBUG nova.virt.libvirt.vif [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-678066720',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-678066720',id=21,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-fih772z7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-
admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:21:27Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=57466778-4bb3-4165-9a9f-bfca9f200d03,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "address": "fa:16:3e:83:bd:d6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e77fe8-6f", "ovs_interfaceid": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.481 2 DEBUG nova.network.os_vif_util [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "address": "fa:16:3e:83:bd:d6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e77fe8-6f", "ovs_interfaceid": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.482 2 DEBUG nova.network.os_vif_util [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:bd:d6,bridge_name='br-int',has_traffic_filtering=True,id=f7e77fe8-6f1f-498f-be00-420b8e788b70,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e77fe8-6f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.482 2 DEBUG os_vif [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:bd:d6,bridge_name='br-int',has_traffic_filtering=True,id=f7e77fe8-6f1f-498f-be00-420b8e788b70,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e77fe8-6f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.484 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.484 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.485 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '10b843f0-1e3d-5a13-aa1b-b4e0fac1229e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.493 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7e77fe8-6f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.494 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf7e77fe8-6f, col_values=(('qos', UUID('8e2adb6b-7b46-4f29-9a9a-3161b6754467')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.494 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf7e77fe8-6f, col_values=(('external_ids', {'iface-id': 'f7e77fe8-6f1f-498f-be00-420b8e788b70', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:bd:d6', 'vm-uuid': '57466778-4bb3-4165-9a9f-bfca9f200d03'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:21:36 compute-0 NetworkManager[51741]: <info>  [1759328496.4971] manager: (tapf7e77fe8-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:36 compute-0 nova_compute[192698]: 2025-10-01 14:21:36.507 2 INFO os_vif [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:bd:d6,bridge_name='br-int',has_traffic_filtering=True,id=f7e77fe8-6f1f-498f-be00-420b8e788b70,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e77fe8-6f')
Oct 01 14:21:38 compute-0 nova_compute[192698]: 2025-10-01 14:21:38.062 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:21:38 compute-0 nova_compute[192698]: 2025-10-01 14:21:38.063 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:21:38 compute-0 nova_compute[192698]: 2025-10-01 14:21:38.063 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] No VIF found with MAC fa:16:3e:83:bd:d6, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:21:38 compute-0 nova_compute[192698]: 2025-10-01 14:21:38.064 2 INFO nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Using config drive
Oct 01 14:21:38 compute-0 nova_compute[192698]: 2025-10-01 14:21:38.577 2 WARNING neutronclient.v2_0.client [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:21:39 compute-0 nova_compute[192698]: 2025-10-01 14:21:39.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:39 compute-0 nova_compute[192698]: 2025-10-01 14:21:39.441 2 INFO nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Creating config drive at /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk.config
Oct 01 14:21:39 compute-0 nova_compute[192698]: 2025-10-01 14:21:39.450 2 DEBUG oslo_concurrency.processutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp_6dpc5zf execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:21:39 compute-0 nova_compute[192698]: 2025-10-01 14:21:39.596 2 DEBUG oslo_concurrency.processutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp_6dpc5zf" returned: 0 in 0.146s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:21:39 compute-0 kernel: tapf7e77fe8-6f: entered promiscuous mode
Oct 01 14:21:39 compute-0 NetworkManager[51741]: <info>  [1759328499.6884] manager: (tapf7e77fe8-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Oct 01 14:21:39 compute-0 ovn_controller[94909]: 2025-10-01T14:21:39Z|00162|binding|INFO|Claiming lport f7e77fe8-6f1f-498f-be00-420b8e788b70 for this chassis.
Oct 01 14:21:39 compute-0 ovn_controller[94909]: 2025-10-01T14:21:39Z|00163|binding|INFO|f7e77fe8-6f1f-498f-be00-420b8e788b70: Claiming fa:16:3e:83:bd:d6 10.100.0.9
Oct 01 14:21:39 compute-0 nova_compute[192698]: 2025-10-01 14:21:39.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.696 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:bd:d6 10.100.0.9'], port_security=['fa:16:3e:83:bd:d6 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '57466778-4bb3-4165-9a9f-bfca9f200d03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=f7e77fe8-6f1f-498f-be00-420b8e788b70) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.697 103791 INFO neutron.agent.ovn.metadata.agent [-] Port f7e77fe8-6f1f-498f-be00-420b8e788b70 in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 bound to our chassis
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.698 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:21:39 compute-0 ovn_controller[94909]: 2025-10-01T14:21:39Z|00164|binding|INFO|Setting lport f7e77fe8-6f1f-498f-be00-420b8e788b70 ovn-installed in OVS
Oct 01 14:21:39 compute-0 ovn_controller[94909]: 2025-10-01T14:21:39Z|00165|binding|INFO|Setting lport f7e77fe8-6f1f-498f-be00-420b8e788b70 up in Southbound
Oct 01 14:21:39 compute-0 nova_compute[192698]: 2025-10-01 14:21:39.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:39 compute-0 nova_compute[192698]: 2025-10-01 14:21:39.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.716 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[75d34597-8a60-4eba-98aa-755a42efb05a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.717 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap031a8987-81 in ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:21:39 compute-0 systemd-udevd[223223]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.720 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap031a8987-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.720 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3da33d-886b-4fc1-95b6-b70d894a623b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.720 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d94287be-3d32-44a3-9822-67f4e24a3fa2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:39 compute-0 NetworkManager[51741]: <info>  [1759328499.7340] device (tapf7e77fe8-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.735 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[9bf33a6f-79b3-421f-bce5-ed929939756a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:39 compute-0 NetworkManager[51741]: <info>  [1759328499.7361] device (tapf7e77fe8-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:21:39 compute-0 systemd-machined[152704]: New machine qemu-15-instance-00000015.
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.751 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[5e65744c-582f-4991-8e7f-a5cb48f3b540]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:39 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000015.
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.788 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a3d18eaf-5971-4597-802a-576dd5c5debd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.793 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[55707cff-3a4b-44dc-8955-9950864642a0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:39 compute-0 systemd-udevd[223228]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:21:39 compute-0 NetworkManager[51741]: <info>  [1759328499.7966] manager: (tap031a8987-80): new Veth device (/org/freedesktop/NetworkManager/Devices/66)
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.830 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[0545a382-1588-41c6-83ef-3e95978869da]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.833 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[5cfb778a-9983-4248-ab3f-058f3097f5b1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:39 compute-0 NetworkManager[51741]: <info>  [1759328499.8628] device (tap031a8987-80): carrier: link connected
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.875 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[1ebe506a-755f-4327-9fc8-058c621dd814]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.897 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e32c0868-c33d-45b1-ba1f-dcff4fd76a25]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486039, 'reachable_time': 19247, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223257, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.916 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[807b7464-0ed7-4f3d-ad97-7b5857eb37f9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:6c81'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486039, 'tstamp': 486039}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223258, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.936 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[0c2124c9-4906-4c29-b916-ac1bfc0e8788]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486039, 'reachable_time': 19247, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223259, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:39.978 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ac139b84-3e03-424f-b3fa-091cb6d436db]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.045 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[83d7e829-7925-46ea-b7e5-1880841a5827]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.046 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.047 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.047 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:21:40 compute-0 nova_compute[192698]: 2025-10-01 14:21:40.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:40 compute-0 NetworkManager[51741]: <info>  [1759328500.0500] manager: (tap031a8987-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Oct 01 14:21:40 compute-0 kernel: tap031a8987-80: entered promiscuous mode
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.052 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:21:40 compute-0 nova_compute[192698]: 2025-10-01 14:21:40.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:40 compute-0 nova_compute[192698]: 2025-10-01 14:21:40.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:40 compute-0 ovn_controller[94909]: 2025-10-01T14:21:40Z|00166|binding|INFO|Releasing lport 6dd814dc-cba2-4392-85ef-eadb8c4615f7 from this chassis (sb_readonly=0)
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.054 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[0db5759c-514a-4a0e-8096-b21df304df23]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.055 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.055 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.055 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 031a8987-8430-4fb6-a464-01e4dca2fae7 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.055 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.055 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[03fe4e73-de89-45a9-94e6-cc994de644b9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.056 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.056 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[2f2e672f-ece1-4fac-9e1a-87deb03acfb5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.057 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:21:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:21:40.057 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'env', 'PROCESS_TAG=haproxy-031a8987-8430-4fb6-a464-01e4dca2fae7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/031a8987-8430-4fb6-a464-01e4dca2fae7.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:21:40 compute-0 nova_compute[192698]: 2025-10-01 14:21:40.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:40 compute-0 nova_compute[192698]: 2025-10-01 14:21:40.493 2 DEBUG nova.compute.manager [req-712269c7-0f8e-4574-ac9b-7a273eac9bbe req-dfe3266b-1825-46bc-b551-60ab6687d8e7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Received event network-vif-plugged-f7e77fe8-6f1f-498f-be00-420b8e788b70 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:21:40 compute-0 nova_compute[192698]: 2025-10-01 14:21:40.495 2 DEBUG oslo_concurrency.lockutils [req-712269c7-0f8e-4574-ac9b-7a273eac9bbe req-dfe3266b-1825-46bc-b551-60ab6687d8e7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:21:40 compute-0 nova_compute[192698]: 2025-10-01 14:21:40.495 2 DEBUG oslo_concurrency.lockutils [req-712269c7-0f8e-4574-ac9b-7a273eac9bbe req-dfe3266b-1825-46bc-b551-60ab6687d8e7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:21:40 compute-0 nova_compute[192698]: 2025-10-01 14:21:40.496 2 DEBUG oslo_concurrency.lockutils [req-712269c7-0f8e-4574-ac9b-7a273eac9bbe req-dfe3266b-1825-46bc-b551-60ab6687d8e7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:21:40 compute-0 nova_compute[192698]: 2025-10-01 14:21:40.496 2 DEBUG nova.compute.manager [req-712269c7-0f8e-4574-ac9b-7a273eac9bbe req-dfe3266b-1825-46bc-b551-60ab6687d8e7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Processing event network-vif-plugged-f7e77fe8-6f1f-498f-be00-420b8e788b70 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:21:40 compute-0 podman[223298]: 2025-10-01 14:21:40.516339249 +0000 UTC m=+0.072614399 container create d743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:21:40 compute-0 podman[223298]: 2025-10-01 14:21:40.473220876 +0000 UTC m=+0.029496056 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:21:40 compute-0 systemd[1]: Started libpod-conmon-d743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc.scope.
Oct 01 14:21:40 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:21:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5d048d3162a6751c766848aa168ac8d4fba0b283f6091790a6d3d68d95ebafa4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:21:40 compute-0 podman[223298]: 2025-10-01 14:21:40.705943623 +0000 UTC m=+0.262218853 container init d743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 01 14:21:40 compute-0 podman[223298]: 2025-10-01 14:21:40.721622076 +0000 UTC m=+0.277897256 container start d743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:21:40 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[223313]: [NOTICE]   (223317) : New worker (223319) forked
Oct 01 14:21:40 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[223313]: [NOTICE]   (223317) : Loading success.
Oct 01 14:21:40 compute-0 nova_compute[192698]: 2025-10-01 14:21:40.909 2 DEBUG nova.compute.manager [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:21:40 compute-0 nova_compute[192698]: 2025-10-01 14:21:40.914 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 01 14:21:40 compute-0 nova_compute[192698]: 2025-10-01 14:21:40.918 2 INFO nova.virt.libvirt.driver [-] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Instance spawned successfully.
Oct 01 14:21:40 compute-0 nova_compute[192698]: 2025-10-01 14:21:40.918 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 01 14:21:41 compute-0 nova_compute[192698]: 2025-10-01 14:21:41.432 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:21:41 compute-0 nova_compute[192698]: 2025-10-01 14:21:41.432 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:21:41 compute-0 nova_compute[192698]: 2025-10-01 14:21:41.433 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:21:41 compute-0 nova_compute[192698]: 2025-10-01 14:21:41.434 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:21:41 compute-0 nova_compute[192698]: 2025-10-01 14:21:41.434 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:21:41 compute-0 nova_compute[192698]: 2025-10-01 14:21:41.435 2 DEBUG nova.virt.libvirt.driver [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:21:41 compute-0 nova_compute[192698]: 2025-10-01 14:21:41.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:41 compute-0 nova_compute[192698]: 2025-10-01 14:21:41.945 2 INFO nova.compute.manager [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Took 13.93 seconds to spawn the instance on the hypervisor.
Oct 01 14:21:41 compute-0 nova_compute[192698]: 2025-10-01 14:21:41.946 2 DEBUG nova.compute.manager [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:21:42 compute-0 nova_compute[192698]: 2025-10-01 14:21:42.489 2 INFO nova.compute.manager [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Took 19.90 seconds to build instance.
Oct 01 14:21:42 compute-0 nova_compute[192698]: 2025-10-01 14:21:42.562 2 DEBUG nova.compute.manager [req-3f5ad315-6c26-4042-bf88-e5df1d384f40 req-770d87c3-62f6-4e13-af50-8cdaf24a1896 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Received event network-vif-plugged-f7e77fe8-6f1f-498f-be00-420b8e788b70 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:21:42 compute-0 nova_compute[192698]: 2025-10-01 14:21:42.563 2 DEBUG oslo_concurrency.lockutils [req-3f5ad315-6c26-4042-bf88-e5df1d384f40 req-770d87c3-62f6-4e13-af50-8cdaf24a1896 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:21:42 compute-0 nova_compute[192698]: 2025-10-01 14:21:42.564 2 DEBUG oslo_concurrency.lockutils [req-3f5ad315-6c26-4042-bf88-e5df1d384f40 req-770d87c3-62f6-4e13-af50-8cdaf24a1896 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:21:42 compute-0 nova_compute[192698]: 2025-10-01 14:21:42.564 2 DEBUG oslo_concurrency.lockutils [req-3f5ad315-6c26-4042-bf88-e5df1d384f40 req-770d87c3-62f6-4e13-af50-8cdaf24a1896 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:21:42 compute-0 nova_compute[192698]: 2025-10-01 14:21:42.565 2 DEBUG nova.compute.manager [req-3f5ad315-6c26-4042-bf88-e5df1d384f40 req-770d87c3-62f6-4e13-af50-8cdaf24a1896 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] No waiting events found dispatching network-vif-plugged-f7e77fe8-6f1f-498f-be00-420b8e788b70 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:21:42 compute-0 nova_compute[192698]: 2025-10-01 14:21:42.565 2 WARNING nova.compute.manager [req-3f5ad315-6c26-4042-bf88-e5df1d384f40 req-770d87c3-62f6-4e13-af50-8cdaf24a1896 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Received unexpected event network-vif-plugged-f7e77fe8-6f1f-498f-be00-420b8e788b70 for instance with vm_state active and task_state None.
Oct 01 14:21:42 compute-0 nova_compute[192698]: 2025-10-01 14:21:42.995 2 DEBUG oslo_concurrency.lockutils [None req-fe933323-159d-46dd-b0ce-5cb2f8303df6 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.418s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:21:44 compute-0 nova_compute[192698]: 2025-10-01 14:21:44.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:44 compute-0 podman[223329]: 2025-10-01 14:21:44.150523881 +0000 UTC m=+0.062136716 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:21:44 compute-0 podman[223330]: 2025-10-01 14:21:44.187898429 +0000 UTC m=+0.099629278 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 01 14:21:46 compute-0 nova_compute[192698]: 2025-10-01 14:21:46.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:49 compute-0 nova_compute[192698]: 2025-10-01 14:21:49.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:51 compute-0 podman[223374]: 2025-10-01 14:21:51.193825235 +0000 UTC m=+0.099787792 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc.)
Oct 01 14:21:51 compute-0 nova_compute[192698]: 2025-10-01 14:21:51.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:52 compute-0 ovn_controller[94909]: 2025-10-01T14:21:52Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:bd:d6 10.100.0.9
Oct 01 14:21:52 compute-0 ovn_controller[94909]: 2025-10-01T14:21:52Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:bd:d6 10.100.0.9
Oct 01 14:21:54 compute-0 nova_compute[192698]: 2025-10-01 14:21:54.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:54 compute-0 nova_compute[192698]: 2025-10-01 14:21:54.513 2 DEBUG nova.virt.libvirt.driver [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Creating tmpfile /var/lib/nova/instances/tmprmgcxqib to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 01 14:21:54 compute-0 nova_compute[192698]: 2025-10-01 14:21:54.515 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:21:54 compute-0 nova_compute[192698]: 2025-10-01 14:21:54.532 2 DEBUG nova.compute.manager [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprmgcxqib',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 01 14:21:56 compute-0 nova_compute[192698]: 2025-10-01 14:21:56.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:56 compute-0 nova_compute[192698]: 2025-10-01 14:21:56.575 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:21:57 compute-0 podman[223415]: 2025-10-01 14:21:57.178887659 +0000 UTC m=+0.095169278 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 01 14:21:57 compute-0 podman[223416]: 2025-10-01 14:21:57.191219071 +0000 UTC m=+0.089291749 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 01 14:21:59 compute-0 nova_compute[192698]: 2025-10-01 14:21:59.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:21:59 compute-0 podman[203144]: time="2025-10-01T14:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:21:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:21:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3489 "" "Go-http-client/1.1"
Oct 01 14:22:00 compute-0 nova_compute[192698]: 2025-10-01 14:22:00.553 2 DEBUG nova.compute.manager [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprmgcxqib',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c11e5fa5-006c-4cf2-a3c6-8da08c4596c9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 01 14:22:01 compute-0 openstack_network_exporter[205307]: ERROR   14:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:22:01 compute-0 openstack_network_exporter[205307]: ERROR   14:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:22:01 compute-0 openstack_network_exporter[205307]: ERROR   14:22:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:22:01 compute-0 openstack_network_exporter[205307]: ERROR   14:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:22:01 compute-0 openstack_network_exporter[205307]: ERROR   14:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:22:01 compute-0 nova_compute[192698]: 2025-10-01 14:22:01.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:01 compute-0 nova_compute[192698]: 2025-10-01 14:22:01.568 2 DEBUG oslo_concurrency.lockutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-c11e5fa5-006c-4cf2-a3c6-8da08c4596c9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:22:01 compute-0 nova_compute[192698]: 2025-10-01 14:22:01.569 2 DEBUG oslo_concurrency.lockutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-c11e5fa5-006c-4cf2-a3c6-8da08c4596c9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:22:01 compute-0 nova_compute[192698]: 2025-10-01 14:22:01.569 2 DEBUG nova.network.neutron [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:22:02 compute-0 nova_compute[192698]: 2025-10-01 14:22:02.075 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:02 compute-0 nova_compute[192698]: 2025-10-01 14:22:02.644 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:02 compute-0 nova_compute[192698]: 2025-10-01 14:22:02.837 2 DEBUG nova.network.neutron [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Updating instance_info_cache with network_info: [{"id": "d274a684-bfdc-4a97-977d-654f7d54585b", "address": "fa:16:3e:54:70:aa", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd274a684-bf", "ovs_interfaceid": "d274a684-bfdc-4a97-977d-654f7d54585b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:22:03 compute-0 podman[223455]: 2025-10-01 14:22:03.165944395 +0000 UTC m=+0.077725007 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.344 2 DEBUG oslo_concurrency.lockutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-c11e5fa5-006c-4cf2-a3c6-8da08c4596c9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.363 2 DEBUG nova.virt.libvirt.driver [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprmgcxqib',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c11e5fa5-006c-4cf2-a3c6-8da08c4596c9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.364 2 DEBUG nova.virt.libvirt.driver [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Creating instance directory: /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.365 2 DEBUG nova.virt.libvirt.driver [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Creating disk.info with the contents: {'/var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk': 'qcow2', '/var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.365 2 DEBUG nova.virt.libvirt.driver [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.366 2 DEBUG nova.objects.instance [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid c11e5fa5-006c-4cf2-a3c6-8da08c4596c9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.875 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.880 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.884 2 DEBUG oslo_concurrency.processutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.974 2 DEBUG oslo_concurrency.processutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.976 2 DEBUG oslo_concurrency.lockutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.977 2 DEBUG oslo_concurrency.lockutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.978 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.985 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:22:03 compute-0 nova_compute[192698]: 2025-10-01 14:22:03.986 2 DEBUG oslo_concurrency.processutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.063 2 DEBUG oslo_concurrency.processutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.064 2 DEBUG oslo_concurrency.processutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.113 2 DEBUG oslo_concurrency.processutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.114 2 DEBUG oslo_concurrency.lockutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.137s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.115 2 DEBUG oslo_concurrency.processutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.183 2 DEBUG oslo_concurrency.processutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.185 2 DEBUG nova.virt.disk.api [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.186 2 DEBUG oslo_concurrency.processutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.278 2 DEBUG oslo_concurrency.processutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.280 2 DEBUG nova.virt.disk.api [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.280 2 DEBUG nova.objects.instance [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid c11e5fa5-006c-4cf2-a3c6-8da08c4596c9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.788 2 DEBUG nova.objects.base [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<c11e5fa5-006c-4cf2-a3c6-8da08c4596c9> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.789 2 DEBUG oslo_concurrency.processutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.829 2 DEBUG oslo_concurrency.processutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk.config 497664" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.831 2 DEBUG nova.virt.libvirt.driver [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.833 2 DEBUG nova.virt.libvirt.vif [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-01T14:21:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1086313905',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1086313905',id=20,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:21:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-dx5vuf34',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:21:17Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=c11e5fa5-006c-4cf2-a3c6-8da08c4596c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d274a684-bfdc-4a97-977d-654f7d54585b", "address": "fa:16:3e:54:70:aa", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd274a684-bf", "ovs_interfaceid": "d274a684-bfdc-4a97-977d-654f7d54585b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.834 2 DEBUG nova.network.os_vif_util [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "d274a684-bfdc-4a97-977d-654f7d54585b", "address": "fa:16:3e:54:70:aa", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd274a684-bf", "ovs_interfaceid": "d274a684-bfdc-4a97-977d-654f7d54585b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.835 2 DEBUG nova.network.os_vif_util [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:aa,bridge_name='br-int',has_traffic_filtering=True,id=d274a684-bfdc-4a97-977d-654f7d54585b,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd274a684-bf') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.836 2 DEBUG os_vif [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:aa,bridge_name='br-int',has_traffic_filtering=True,id=d274a684-bfdc-4a97-977d-654f7d54585b,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd274a684-bf') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.838 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.839 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.841 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '0b4c6596-6ee1-5041-8485-3b09bc13590c', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.887 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd274a684-bf, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.888 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd274a684-bf, col_values=(('qos', UUID('67224eb9-577c-410b-b364-874fb8931774')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.888 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd274a684-bf, col_values=(('external_ids', {'iface-id': 'd274a684-bfdc-4a97-977d-654f7d54585b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:54:70:aa', 'vm-uuid': 'c11e5fa5-006c-4cf2-a3c6-8da08c4596c9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:04 compute-0 NetworkManager[51741]: <info>  [1759328524.8917] manager: (tapd274a684-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.902 2 INFO os_vif [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:54:70:aa,bridge_name='br-int',has_traffic_filtering=True,id=d274a684-bfdc-4a97-977d-654f7d54585b,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd274a684-bf')
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.903 2 DEBUG nova.virt.libvirt.driver [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.904 2 DEBUG nova.compute.manager [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprmgcxqib',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c11e5fa5-006c-4cf2-a3c6-8da08c4596c9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 01 14:22:04 compute-0 nova_compute[192698]: 2025-10-01 14:22:04.905 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:05 compute-0 nova_compute[192698]: 2025-10-01 14:22:05.302 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:06 compute-0 nova_compute[192698]: 2025-10-01 14:22:06.343 2 DEBUG nova.network.neutron [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Port d274a684-bfdc-4a97-977d-654f7d54585b updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 01 14:22:06 compute-0 nova_compute[192698]: 2025-10-01 14:22:06.368 2 DEBUG nova.compute.manager [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprmgcxqib',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c11e5fa5-006c-4cf2-a3c6-8da08c4596c9',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 01 14:22:06 compute-0 unix_chkpwd[223501]: password check failed for user (root)
Oct 01 14:22:06 compute-0 sshd-session[223499]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 01 14:22:08 compute-0 sshd-session[223499]: Failed password for root from 193.46.255.244 port 12798 ssh2
Oct 01 14:22:09 compute-0 nova_compute[192698]: 2025-10-01 14:22:09.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:09 compute-0 unix_chkpwd[223502]: password check failed for user (root)
Oct 01 14:22:09 compute-0 ovn_controller[94909]: 2025-10-01T14:22:09Z|00167|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Oct 01 14:22:09 compute-0 nova_compute[192698]: 2025-10-01 14:22:09.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:11 compute-0 sshd-session[223499]: Failed password for root from 193.46.255.244 port 12798 ssh2
Oct 01 14:22:11 compute-0 kernel: tapd274a684-bf: entered promiscuous mode
Oct 01 14:22:11 compute-0 NetworkManager[51741]: <info>  [1759328531.3043] manager: (tapd274a684-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Oct 01 14:22:11 compute-0 ovn_controller[94909]: 2025-10-01T14:22:11Z|00168|binding|INFO|Claiming lport d274a684-bfdc-4a97-977d-654f7d54585b for this additional chassis.
Oct 01 14:22:11 compute-0 ovn_controller[94909]: 2025-10-01T14:22:11Z|00169|binding|INFO|d274a684-bfdc-4a97-977d-654f7d54585b: Claiming fa:16:3e:54:70:aa 10.100.0.6
Oct 01 14:22:11 compute-0 nova_compute[192698]: 2025-10-01 14:22:11.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:11 compute-0 ovn_controller[94909]: 2025-10-01T14:22:11Z|00170|binding|INFO|Setting lport d274a684-bfdc-4a97-977d-654f7d54585b ovn-installed in OVS
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.325 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:70:aa 10.100.0.6'], port_security=['fa:16:3e:54:70:aa 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c11e5fa5-006c-4cf2-a3c6-8da08c4596c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '10', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=d274a684-bfdc-4a97-977d-654f7d54585b) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:22:11 compute-0 nova_compute[192698]: 2025-10-01 14:22:11.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:11 compute-0 nova_compute[192698]: 2025-10-01 14:22:11.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.328 103791 INFO neutron.agent.ovn.metadata.agent [-] Port d274a684-bfdc-4a97-977d-654f7d54585b in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.329 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.355 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[321dae1e-c2fa-498d-8c9e-9eb1101deabd]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:11 compute-0 systemd-machined[152704]: New machine qemu-16-instance-00000014.
Oct 01 14:22:11 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000014.
Oct 01 14:22:11 compute-0 systemd-udevd[223521]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.397 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[85e34b0b-62d5-44e2-978f-7f97a6ba6670]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.401 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[b0bd58a6-00a7-4988-9e65-cc5a4534194a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:11 compute-0 NetworkManager[51741]: <info>  [1759328531.4146] device (tapd274a684-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:22:11 compute-0 NetworkManager[51741]: <info>  [1759328531.4158] device (tapd274a684-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.449 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[61bfaea7-2bd9-474e-b2fe-fe647fcd58ca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.475 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1b8d6dcc-f6f6-4a14-9972-bd7257c11dcb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486039, 'reachable_time': 19247, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223531, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.504 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[261d1b60-642f-428b-9b32-e70dc00f5ea7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486053, 'tstamp': 486053}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223533, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486057, 'tstamp': 486057}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223533, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.506 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.510 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.510 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.510 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.511 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:22:11 compute-0 nova_compute[192698]: 2025-10-01 14:22:11.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:11 compute-0 nova_compute[192698]: 2025-10-01 14:22:11.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:11 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:11.513 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ca80e50f-322e-4dc9-adb3-b0c29b8da4ea]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:12 compute-0 unix_chkpwd[223541]: password check failed for user (root)
Oct 01 14:22:14 compute-0 nova_compute[192698]: 2025-10-01 14:22:14.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:14 compute-0 ovn_controller[94909]: 2025-10-01T14:22:14Z|00171|binding|INFO|Claiming lport d274a684-bfdc-4a97-977d-654f7d54585b for this chassis.
Oct 01 14:22:14 compute-0 ovn_controller[94909]: 2025-10-01T14:22:14Z|00172|binding|INFO|d274a684-bfdc-4a97-977d-654f7d54585b: Claiming fa:16:3e:54:70:aa 10.100.0.6
Oct 01 14:22:14 compute-0 ovn_controller[94909]: 2025-10-01T14:22:14Z|00173|binding|INFO|Setting lport d274a684-bfdc-4a97-977d-654f7d54585b up in Southbound
Oct 01 14:22:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:14.278 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:14.279 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:14.279 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:14 compute-0 sshd-session[223499]: Failed password for root from 193.46.255.244 port 12798 ssh2
Oct 01 14:22:14 compute-0 nova_compute[192698]: 2025-10-01 14:22:14.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:14 compute-0 sshd-session[223499]: Received disconnect from 193.46.255.244 port 12798:11:  [preauth]
Oct 01 14:22:14 compute-0 sshd-session[223499]: Disconnected from authenticating user root 193.46.255.244 port 12798 [preauth]
Oct 01 14:22:14 compute-0 sshd-session[223499]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 01 14:22:15 compute-0 podman[223553]: 2025-10-01 14:22:15.182060133 +0000 UTC m=+0.075664652 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Oct 01 14:22:15 compute-0 nova_compute[192698]: 2025-10-01 14:22:15.229 2 INFO nova.compute.manager [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Post operation of migration started
Oct 01 14:22:15 compute-0 nova_compute[192698]: 2025-10-01 14:22:15.230 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:15 compute-0 podman[223556]: 2025-10-01 14:22:15.254442915 +0000 UTC m=+0.149274937 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct 01 14:22:15 compute-0 nova_compute[192698]: 2025-10-01 14:22:15.356 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:15 compute-0 nova_compute[192698]: 2025-10-01 14:22:15.357 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:15 compute-0 nova_compute[192698]: 2025-10-01 14:22:15.424 2 DEBUG oslo_concurrency.lockutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-c11e5fa5-006c-4cf2-a3c6-8da08c4596c9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:22:15 compute-0 nova_compute[192698]: 2025-10-01 14:22:15.425 2 DEBUG oslo_concurrency.lockutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-c11e5fa5-006c-4cf2-a3c6-8da08c4596c9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:22:15 compute-0 nova_compute[192698]: 2025-10-01 14:22:15.425 2 DEBUG nova.network.neutron [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:22:15 compute-0 unix_chkpwd[223597]: password check failed for user (root)
Oct 01 14:22:15 compute-0 sshd-session[223554]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 01 14:22:15 compute-0 nova_compute[192698]: 2025-10-01 14:22:15.934 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:17 compute-0 sshd-session[223554]: Failed password for root from 193.46.255.244 port 58042 ssh2
Oct 01 14:22:18 compute-0 nova_compute[192698]: 2025-10-01 14:22:18.344 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:18 compute-0 unix_chkpwd[223598]: password check failed for user (root)
Oct 01 14:22:19 compute-0 nova_compute[192698]: 2025-10-01 14:22:19.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:19 compute-0 nova_compute[192698]: 2025-10-01 14:22:19.319 2 DEBUG nova.network.neutron [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Updating instance_info_cache with network_info: [{"id": "d274a684-bfdc-4a97-977d-654f7d54585b", "address": "fa:16:3e:54:70:aa", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd274a684-bf", "ovs_interfaceid": "d274a684-bfdc-4a97-977d-654f7d54585b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:22:19 compute-0 nova_compute[192698]: 2025-10-01 14:22:19.832 2 DEBUG oslo_concurrency.lockutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-c11e5fa5-006c-4cf2-a3c6-8da08c4596c9" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:22:19 compute-0 nova_compute[192698]: 2025-10-01 14:22:19.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:20 compute-0 nova_compute[192698]: 2025-10-01 14:22:20.354 2 DEBUG oslo_concurrency.lockutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:20 compute-0 nova_compute[192698]: 2025-10-01 14:22:20.355 2 DEBUG oslo_concurrency.lockutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:20 compute-0 nova_compute[192698]: 2025-10-01 14:22:20.355 2 DEBUG oslo_concurrency.lockutils [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:20 compute-0 nova_compute[192698]: 2025-10-01 14:22:20.364 2 INFO nova.virt.libvirt.driver [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 01 14:22:20 compute-0 virtqemud[192597]: Domain id=16 name='instance-00000014' uuid=c11e5fa5-006c-4cf2-a3c6-8da08c4596c9 is tainted: custom-monitor
Oct 01 14:22:20 compute-0 sshd-session[223554]: Failed password for root from 193.46.255.244 port 58042 ssh2
Oct 01 14:22:20 compute-0 nova_compute[192698]: 2025-10-01 14:22:20.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:22:21 compute-0 nova_compute[192698]: 2025-10-01 14:22:21.375 2 INFO nova.virt.libvirt.driver [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 01 14:22:21 compute-0 unix_chkpwd[223599]: password check failed for user (root)
Oct 01 14:22:21 compute-0 nova_compute[192698]: 2025-10-01 14:22:21.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:22:22 compute-0 podman[223600]: 2025-10-01 14:22:22.219003295 +0000 UTC m=+0.126740310 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Oct 01 14:22:22 compute-0 nova_compute[192698]: 2025-10-01 14:22:22.384 2 INFO nova.virt.libvirt.driver [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 01 14:22:22 compute-0 nova_compute[192698]: 2025-10-01 14:22:22.393 2 DEBUG nova.compute.manager [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:22:22 compute-0 nova_compute[192698]: 2025-10-01 14:22:22.439 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:22 compute-0 nova_compute[192698]: 2025-10-01 14:22:22.440 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:22 compute-0 nova_compute[192698]: 2025-10-01 14:22:22.441 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:22 compute-0 nova_compute[192698]: 2025-10-01 14:22:22.441 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:22:22 compute-0 nova_compute[192698]: 2025-10-01 14:22:22.904 2 DEBUG nova.objects.instance [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:22:23 compute-0 sshd-session[223554]: Failed password for root from 193.46.255.244 port 58042 ssh2
Oct 01 14:22:23 compute-0 nova_compute[192698]: 2025-10-01 14:22:23.503 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:22:23 compute-0 nova_compute[192698]: 2025-10-01 14:22:23.593 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:22:23 compute-0 nova_compute[192698]: 2025-10-01 14:22:23.595 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:22:23 compute-0 nova_compute[192698]: 2025-10-01 14:22:23.680 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:22:23 compute-0 nova_compute[192698]: 2025-10-01 14:22:23.689 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:22:23 compute-0 nova_compute[192698]: 2025-10-01 14:22:23.771 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:22:23 compute-0 nova_compute[192698]: 2025-10-01 14:22:23.772 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:22:23 compute-0 nova_compute[192698]: 2025-10-01 14:22:23.862 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:22:23 compute-0 nova_compute[192698]: 2025-10-01 14:22:23.934 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:24 compute-0 nova_compute[192698]: 2025-10-01 14:22:24.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:24 compute-0 nova_compute[192698]: 2025-10-01 14:22:24.134 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:22:24 compute-0 nova_compute[192698]: 2025-10-01 14:22:24.135 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:22:24 compute-0 nova_compute[192698]: 2025-10-01 14:22:24.161 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:22:24 compute-0 nova_compute[192698]: 2025-10-01 14:22:24.162 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5530MB free_disk=73.24469375610352GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:22:24 compute-0 nova_compute[192698]: 2025-10-01 14:22:24.162 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:24 compute-0 nova_compute[192698]: 2025-10-01 14:22:24.163 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:24 compute-0 sshd-session[223554]: Received disconnect from 193.46.255.244 port 58042:11:  [preauth]
Oct 01 14:22:24 compute-0 sshd-session[223554]: Disconnected from authenticating user root 193.46.255.244 port 58042 [preauth]
Oct 01 14:22:24 compute-0 sshd-session[223554]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 01 14:22:24 compute-0 nova_compute[192698]: 2025-10-01 14:22:24.380 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:24 compute-0 nova_compute[192698]: 2025-10-01 14:22:24.381 2 WARNING neutronclient.v2_0.client [None req-ed599c57-3689-4c2c-969e-82973e4bf959 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:24 compute-0 nova_compute[192698]: 2025-10-01 14:22:24.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:25 compute-0 unix_chkpwd[223638]: password check failed for user (root)
Oct 01 14:22:25 compute-0 sshd-session[223636]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 01 14:22:25 compute-0 nova_compute[192698]: 2025-10-01 14:22:25.185 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Applying migration context for instance c11e5fa5-006c-4cf2-a3c6-8da08c4596c9 as it has an incoming, in-progress migration fe6018d9-a8ce-40b4-adf0-2e80b662dc1d. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Oct 01 14:22:25 compute-0 nova_compute[192698]: 2025-10-01 14:22:25.186 2 DEBUG nova.objects.instance [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:22:25 compute-0 nova_compute[192698]: 2025-10-01 14:22:25.694 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 01 14:22:25 compute-0 nova_compute[192698]: 2025-10-01 14:22:25.735 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 57466778-4bb3-4165-9a9f-bfca9f200d03 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:22:25 compute-0 nova_compute[192698]: 2025-10-01 14:22:25.736 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance c11e5fa5-006c-4cf2-a3c6-8da08c4596c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:22:25 compute-0 nova_compute[192698]: 2025-10-01 14:22:25.737 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:22:25 compute-0 nova_compute[192698]: 2025-10-01 14:22:25.737 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:22:24 up  1:21,  0 user,  load average: 0.35, 0.26, 0.32\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_d43115e3729442e1b68b749acc0dabc8': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:22:25 compute-0 nova_compute[192698]: 2025-10-01 14:22:25.809 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:22:26 compute-0 nova_compute[192698]: 2025-10-01 14:22:26.319 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:22:26 compute-0 nova_compute[192698]: 2025-10-01 14:22:26.833 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:22:26 compute-0 nova_compute[192698]: 2025-10-01 14:22:26.834 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.671s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:27 compute-0 sshd-session[223636]: Failed password for root from 193.46.255.244 port 61556 ssh2
Oct 01 14:22:27 compute-0 unix_chkpwd[223639]: password check failed for user (root)
Oct 01 14:22:28 compute-0 podman[223640]: 2025-10-01 14:22:28.217883119 +0000 UTC m=+0.090803480 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 01 14:22:28 compute-0 podman[223641]: 2025-10-01 14:22:28.224200589 +0000 UTC m=+0.092818254 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930)
Oct 01 14:22:29 compute-0 nova_compute[192698]: 2025-10-01 14:22:29.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:29 compute-0 podman[203144]: time="2025-10-01T14:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:22:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:22:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3492 "" "Go-http-client/1.1"
Oct 01 14:22:29 compute-0 sshd-session[223636]: Failed password for root from 193.46.255.244 port 61556 ssh2
Oct 01 14:22:29 compute-0 nova_compute[192698]: 2025-10-01 14:22:29.836 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:22:29 compute-0 nova_compute[192698]: 2025-10-01 14:22:29.837 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:22:29 compute-0 nova_compute[192698]: 2025-10-01 14:22:29.839 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:22:29 compute-0 nova_compute[192698]: 2025-10-01 14:22:29.839 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:22:29 compute-0 nova_compute[192698]: 2025-10-01 14:22:29.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:30 compute-0 unix_chkpwd[223688]: password check failed for user (root)
Oct 01 14:22:30 compute-0 nova_compute[192698]: 2025-10-01 14:22:30.927 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.074 2 DEBUG oslo_concurrency.lockutils [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "57466778-4bb3-4165-9a9f-bfca9f200d03" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.075 2 DEBUG oslo_concurrency.lockutils [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.076 2 DEBUG oslo_concurrency.lockutils [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.076 2 DEBUG oslo_concurrency.lockutils [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.077 2 DEBUG oslo_concurrency.lockutils [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.096 2 INFO nova.compute.manager [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Terminating instance
Oct 01 14:22:31 compute-0 openstack_network_exporter[205307]: ERROR   14:22:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:22:31 compute-0 openstack_network_exporter[205307]: ERROR   14:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:22:31 compute-0 openstack_network_exporter[205307]: ERROR   14:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:22:31 compute-0 openstack_network_exporter[205307]: ERROR   14:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:22:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:22:31 compute-0 openstack_network_exporter[205307]: ERROR   14:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:22:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.616 2 DEBUG nova.compute.manager [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:22:31 compute-0 kernel: tapf7e77fe8-6f (unregistering): left promiscuous mode
Oct 01 14:22:31 compute-0 NetworkManager[51741]: <info>  [1759328551.6505] device (tapf7e77fe8-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:22:31 compute-0 ovn_controller[94909]: 2025-10-01T14:22:31Z|00174|binding|INFO|Releasing lport f7e77fe8-6f1f-498f-be00-420b8e788b70 from this chassis (sb_readonly=0)
Oct 01 14:22:31 compute-0 ovn_controller[94909]: 2025-10-01T14:22:31Z|00175|binding|INFO|Setting lport f7e77fe8-6f1f-498f-be00-420b8e788b70 down in Southbound
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:31 compute-0 ovn_controller[94909]: 2025-10-01T14:22:31Z|00176|binding|INFO|Removing iface tapf7e77fe8-6f ovn-installed in OVS
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.687 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:bd:d6 10.100.0.9'], port_security=['fa:16:3e:83:bd:d6 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '57466778-4bb3-4165-9a9f-bfca9f200d03', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '5', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=f7e77fe8-6f1f-498f-be00-420b8e788b70) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.688 103791 INFO neutron.agent.ovn.metadata.agent [-] Port f7e77fe8-6f1f-498f-be00-420b8e788b70 in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.690 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:31 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Deactivated successfully.
Oct 01 14:22:31 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000015.scope: Consumed 14.857s CPU time.
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.723 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7f14ff17-7f7d-4eb7-b39c-4d863652757a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:31 compute-0 systemd-machined[152704]: Machine qemu-15-instance-00000015 terminated.
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.767 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[61623477-7d74-4543-90f0-17e8a8db6055]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.771 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a5dcc741-5fac-4a7f-9023-fbdd735fefa3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.803 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[48c1f27d-3583-4efe-8a2e-9ffb1d9ccb2a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.828 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[74641b2b-daa5-4f49-922c-1fe3a1ed17b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486039, 'reachable_time': 19247, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223701, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.857 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[342e4aa9-855a-4eaf-84f1-ac38c73e569d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486053, 'tstamp': 486053}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223704, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 486057, 'tstamp': 486057}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223704, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.860 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.871 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.871 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.871 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.872 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:22:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:31.874 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[71cbd3b2-8562-4a71-b34b-e784ff2c0394]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.893 2 INFO nova.virt.libvirt.driver [-] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Instance destroyed successfully.
Oct 01 14:22:31 compute-0 nova_compute[192698]: 2025-10-01 14:22:31.894 2 DEBUG nova.objects.instance [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lazy-loading 'resources' on Instance uuid 57466778-4bb3-4165-9a9f-bfca9f200d03 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.407 2 DEBUG nova.virt.libvirt.vif [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:21:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-678066720',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-678066720',id=21,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:21:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-fih772z7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:21:41Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=57466778-4bb3-4165-9a9f-bfca9f200d03,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "address": "fa:16:3e:83:bd:d6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e77fe8-6f", "ovs_interfaceid": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.408 2 DEBUG nova.network.os_vif_util [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "address": "fa:16:3e:83:bd:d6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e77fe8-6f", "ovs_interfaceid": "f7e77fe8-6f1f-498f-be00-420b8e788b70", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.411 2 DEBUG nova.network.os_vif_util [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:bd:d6,bridge_name='br-int',has_traffic_filtering=True,id=f7e77fe8-6f1f-498f-be00-420b8e788b70,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e77fe8-6f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.412 2 DEBUG os_vif [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:bd:d6,bridge_name='br-int',has_traffic_filtering=True,id=f7e77fe8-6f1f-498f-be00-420b8e788b70,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e77fe8-6f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.418 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7e77fe8-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.426 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=8e2adb6b-7b46-4f29-9a9a-3161b6754467) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.431 2 INFO os_vif [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:bd:d6,bridge_name='br-int',has_traffic_filtering=True,id=f7e77fe8-6f1f-498f-be00-420b8e788b70,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e77fe8-6f')
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.432 2 INFO nova.virt.libvirt.driver [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Deleting instance files /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03_del
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.433 2 INFO nova.virt.libvirt.driver [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Deletion of /var/lib/nova/instances/57466778-4bb3-4165-9a9f-bfca9f200d03_del complete
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.468 2 DEBUG nova.compute.manager [req-9abc7eb1-a14a-41c6-8303-0c5a37395a44 req-4ee4490a-4019-435c-b0de-eed904de6964 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Received event network-vif-unplugged-f7e77fe8-6f1f-498f-be00-420b8e788b70 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.469 2 DEBUG oslo_concurrency.lockutils [req-9abc7eb1-a14a-41c6-8303-0c5a37395a44 req-4ee4490a-4019-435c-b0de-eed904de6964 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.469 2 DEBUG oslo_concurrency.lockutils [req-9abc7eb1-a14a-41c6-8303-0c5a37395a44 req-4ee4490a-4019-435c-b0de-eed904de6964 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.469 2 DEBUG oslo_concurrency.lockutils [req-9abc7eb1-a14a-41c6-8303-0c5a37395a44 req-4ee4490a-4019-435c-b0de-eed904de6964 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.470 2 DEBUG nova.compute.manager [req-9abc7eb1-a14a-41c6-8303-0c5a37395a44 req-4ee4490a-4019-435c-b0de-eed904de6964 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] No waiting events found dispatching network-vif-unplugged-f7e77fe8-6f1f-498f-be00-420b8e788b70 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.471 2 DEBUG nova.compute.manager [req-9abc7eb1-a14a-41c6-8303-0c5a37395a44 req-4ee4490a-4019-435c-b0de-eed904de6964 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Received event network-vif-unplugged-f7e77fe8-6f1f-498f-be00-420b8e788b70 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.949 2 INFO nova.compute.manager [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Took 1.33 seconds to destroy the instance on the hypervisor.
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.950 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.950 2 DEBUG nova.compute.manager [-] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.951 2 DEBUG nova.network.neutron [-] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:22:32 compute-0 nova_compute[192698]: 2025-10-01 14:22:32.951 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:32 compute-0 sshd-session[223636]: Failed password for root from 193.46.255.244 port 61556 ssh2
Oct 01 14:22:33 compute-0 nova_compute[192698]: 2025-10-01 14:22:33.368 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:33 compute-0 sshd-session[223636]: Received disconnect from 193.46.255.244 port 61556:11:  [preauth]
Oct 01 14:22:33 compute-0 sshd-session[223636]: Disconnected from authenticating user root 193.46.255.244 port 61556 [preauth]
Oct 01 14:22:33 compute-0 sshd-session[223636]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Oct 01 14:22:33 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:33.600 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:22:33 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:33.601 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:22:33 compute-0 nova_compute[192698]: 2025-10-01 14:22:33.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:33 compute-0 nova_compute[192698]: 2025-10-01 14:22:33.705 2 DEBUG nova.compute.manager [req-18070d36-0d66-4c54-92c9-ae62e2f48248 req-ddce9734-7a18-45b1-a032-efca517186b0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Received event network-vif-deleted-f7e77fe8-6f1f-498f-be00-420b8e788b70 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:22:33 compute-0 nova_compute[192698]: 2025-10-01 14:22:33.706 2 INFO nova.compute.manager [req-18070d36-0d66-4c54-92c9-ae62e2f48248 req-ddce9734-7a18-45b1-a032-efca517186b0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Neutron deleted interface f7e77fe8-6f1f-498f-be00-420b8e788b70; detaching it from the instance and deleting it from the info cache
Oct 01 14:22:33 compute-0 nova_compute[192698]: 2025-10-01 14:22:33.706 2 DEBUG nova.network.neutron [req-18070d36-0d66-4c54-92c9-ae62e2f48248 req-ddce9734-7a18-45b1-a032-efca517186b0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:22:34 compute-0 nova_compute[192698]: 2025-10-01 14:22:34.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:34 compute-0 nova_compute[192698]: 2025-10-01 14:22:34.157 2 DEBUG nova.network.neutron [-] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:22:34 compute-0 podman[223721]: 2025-10-01 14:22:34.179801218 +0000 UTC m=+0.086670089 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:22:34 compute-0 nova_compute[192698]: 2025-10-01 14:22:34.219 2 DEBUG nova.compute.manager [req-18070d36-0d66-4c54-92c9-ae62e2f48248 req-ddce9734-7a18-45b1-a032-efca517186b0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Detach interface failed, port_id=f7e77fe8-6f1f-498f-be00-420b8e788b70, reason: Instance 57466778-4bb3-4165-9a9f-bfca9f200d03 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:22:34 compute-0 nova_compute[192698]: 2025-10-01 14:22:34.529 2 DEBUG nova.compute.manager [req-489c596d-37db-47b1-ac79-d58657f524d0 req-b3538812-09eb-4923-a3ae-324062716893 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Received event network-vif-unplugged-f7e77fe8-6f1f-498f-be00-420b8e788b70 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:22:34 compute-0 nova_compute[192698]: 2025-10-01 14:22:34.529 2 DEBUG oslo_concurrency.lockutils [req-489c596d-37db-47b1-ac79-d58657f524d0 req-b3538812-09eb-4923-a3ae-324062716893 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:34 compute-0 nova_compute[192698]: 2025-10-01 14:22:34.530 2 DEBUG oslo_concurrency.lockutils [req-489c596d-37db-47b1-ac79-d58657f524d0 req-b3538812-09eb-4923-a3ae-324062716893 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:34 compute-0 nova_compute[192698]: 2025-10-01 14:22:34.530 2 DEBUG oslo_concurrency.lockutils [req-489c596d-37db-47b1-ac79-d58657f524d0 req-b3538812-09eb-4923-a3ae-324062716893 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:34 compute-0 nova_compute[192698]: 2025-10-01 14:22:34.531 2 DEBUG nova.compute.manager [req-489c596d-37db-47b1-ac79-d58657f524d0 req-b3538812-09eb-4923-a3ae-324062716893 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] No waiting events found dispatching network-vif-unplugged-f7e77fe8-6f1f-498f-be00-420b8e788b70 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:22:34 compute-0 nova_compute[192698]: 2025-10-01 14:22:34.531 2 DEBUG nova.compute.manager [req-489c596d-37db-47b1-ac79-d58657f524d0 req-b3538812-09eb-4923-a3ae-324062716893 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Received event network-vif-unplugged-f7e77fe8-6f1f-498f-be00-420b8e788b70 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:22:34 compute-0 nova_compute[192698]: 2025-10-01 14:22:34.663 2 INFO nova.compute.manager [-] [instance: 57466778-4bb3-4165-9a9f-bfca9f200d03] Took 1.71 seconds to deallocate network for instance.
Oct 01 14:22:35 compute-0 nova_compute[192698]: 2025-10-01 14:22:35.191 2 DEBUG oslo_concurrency.lockutils [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:35 compute-0 nova_compute[192698]: 2025-10-01 14:22:35.191 2 DEBUG oslo_concurrency.lockutils [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:35 compute-0 nova_compute[192698]: 2025-10-01 14:22:35.284 2 DEBUG nova.compute.provider_tree [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:22:35 compute-0 nova_compute[192698]: 2025-10-01 14:22:35.792 2 DEBUG nova.scheduler.client.report [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:22:36 compute-0 nova_compute[192698]: 2025-10-01 14:22:36.303 2 DEBUG oslo_concurrency.lockutils [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:36 compute-0 nova_compute[192698]: 2025-10-01 14:22:36.332 2 INFO nova.scheduler.client.report [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Deleted allocations for instance 57466778-4bb3-4165-9a9f-bfca9f200d03
Oct 01 14:22:37 compute-0 nova_compute[192698]: 2025-10-01 14:22:37.359 2 DEBUG oslo_concurrency.lockutils [None req-ea094b00-02c5-4a85-b63b-e461319f4226 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "57466778-4bb3-4165-9a9f-bfca9f200d03" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.284s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:37 compute-0 nova_compute[192698]: 2025-10-01 14:22:37.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:38 compute-0 nova_compute[192698]: 2025-10-01 14:22:38.863 2 DEBUG oslo_concurrency.lockutils [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "c11e5fa5-006c-4cf2-a3c6-8da08c4596c9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:38 compute-0 nova_compute[192698]: 2025-10-01 14:22:38.864 2 DEBUG oslo_concurrency.lockutils [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "c11e5fa5-006c-4cf2-a3c6-8da08c4596c9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:38 compute-0 nova_compute[192698]: 2025-10-01 14:22:38.865 2 DEBUG oslo_concurrency.lockutils [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "c11e5fa5-006c-4cf2-a3c6-8da08c4596c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:38 compute-0 nova_compute[192698]: 2025-10-01 14:22:38.865 2 DEBUG oslo_concurrency.lockutils [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "c11e5fa5-006c-4cf2-a3c6-8da08c4596c9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:38 compute-0 nova_compute[192698]: 2025-10-01 14:22:38.866 2 DEBUG oslo_concurrency.lockutils [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "c11e5fa5-006c-4cf2-a3c6-8da08c4596c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:38 compute-0 nova_compute[192698]: 2025-10-01 14:22:38.881 2 INFO nova.compute.manager [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Terminating instance
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.401 2 DEBUG nova.compute.manager [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:22:39 compute-0 kernel: tapd274a684-bf (unregistering): left promiscuous mode
Oct 01 14:22:39 compute-0 NetworkManager[51741]: <info>  [1759328559.4321] device (tapd274a684-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:39 compute-0 ovn_controller[94909]: 2025-10-01T14:22:39Z|00177|binding|INFO|Releasing lport d274a684-bfdc-4a97-977d-654f7d54585b from this chassis (sb_readonly=0)
Oct 01 14:22:39 compute-0 ovn_controller[94909]: 2025-10-01T14:22:39Z|00178|binding|INFO|Setting lport d274a684-bfdc-4a97-977d-654f7d54585b down in Southbound
Oct 01 14:22:39 compute-0 ovn_controller[94909]: 2025-10-01T14:22:39Z|00179|binding|INFO|Removing iface tapd274a684-bf ovn-installed in OVS
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.456 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:54:70:aa 10.100.0.6'], port_security=['fa:16:3e:54:70:aa 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c11e5fa5-006c-4cf2-a3c6-8da08c4596c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '15', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=d274a684-bfdc-4a97-977d-654f7d54585b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.460 103791 INFO neutron.agent.ovn.metadata.agent [-] Port d274a684-bfdc-4a97-977d-654f7d54585b in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.462 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 031a8987-8430-4fb6-a464-01e4dca2fae7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.463 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[74669304-3447-4ee9-9dac-d22121edb57f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.464 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 namespace which is not needed anymore
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:39 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000014.scope: Deactivated successfully.
Oct 01 14:22:39 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000014.scope: Consumed 2.888s CPU time.
Oct 01 14:22:39 compute-0 systemd-machined[152704]: Machine qemu-16-instance-00000014 terminated.
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.578 2 DEBUG nova.compute.manager [req-173d0ecb-ee78-4b07-8a92-f4a05a6bc152 req-02173a66-e542-4ffe-bc9d-04855228517d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Received event network-vif-unplugged-d274a684-bfdc-4a97-977d-654f7d54585b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.579 2 DEBUG oslo_concurrency.lockutils [req-173d0ecb-ee78-4b07-8a92-f4a05a6bc152 req-02173a66-e542-4ffe-bc9d-04855228517d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "c11e5fa5-006c-4cf2-a3c6-8da08c4596c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.579 2 DEBUG oslo_concurrency.lockutils [req-173d0ecb-ee78-4b07-8a92-f4a05a6bc152 req-02173a66-e542-4ffe-bc9d-04855228517d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "c11e5fa5-006c-4cf2-a3c6-8da08c4596c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.580 2 DEBUG oslo_concurrency.lockutils [req-173d0ecb-ee78-4b07-8a92-f4a05a6bc152 req-02173a66-e542-4ffe-bc9d-04855228517d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "c11e5fa5-006c-4cf2-a3c6-8da08c4596c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.580 2 DEBUG nova.compute.manager [req-173d0ecb-ee78-4b07-8a92-f4a05a6bc152 req-02173a66-e542-4ffe-bc9d-04855228517d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] No waiting events found dispatching network-vif-unplugged-d274a684-bfdc-4a97-977d-654f7d54585b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.581 2 DEBUG nova.compute.manager [req-173d0ecb-ee78-4b07-8a92-f4a05a6bc152 req-02173a66-e542-4ffe-bc9d-04855228517d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Received event network-vif-unplugged-d274a684-bfdc-4a97-977d-654f7d54585b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.602 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:39 compute-0 podman[223770]: 2025-10-01 14:22:39.675090653 +0000 UTC m=+0.061041537 container kill d743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 01 14:22:39 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[223313]: [NOTICE]   (223317) : haproxy version is 3.0.5-8e879a5
Oct 01 14:22:39 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[223313]: [NOTICE]   (223317) : path to executable is /usr/sbin/haproxy
Oct 01 14:22:39 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[223313]: [WARNING]  (223317) : Exiting Master process...
Oct 01 14:22:39 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[223313]: [ALERT]    (223317) : Current worker (223319) exited with code 143 (Terminated)
Oct 01 14:22:39 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[223313]: [WARNING]  (223317) : All workers exited. Exiting... (0)
Oct 01 14:22:39 compute-0 systemd[1]: libpod-d743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc.scope: Deactivated successfully.
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.696 2 INFO nova.virt.libvirt.driver [-] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Instance destroyed successfully.
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.697 2 DEBUG nova.objects.instance [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lazy-loading 'resources' on Instance uuid c11e5fa5-006c-4cf2-a3c6-8da08c4596c9 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:22:39 compute-0 podman[223801]: 2025-10-01 14:22:39.737786824 +0000 UTC m=+0.036845865 container died d743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 01 14:22:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc-userdata-shm.mount: Deactivated successfully.
Oct 01 14:22:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-5d048d3162a6751c766848aa168ac8d4fba0b283f6091790a6d3d68d95ebafa4-merged.mount: Deactivated successfully.
Oct 01 14:22:39 compute-0 podman[223801]: 2025-10-01 14:22:39.787048213 +0000 UTC m=+0.086107224 container cleanup d743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 01 14:22:39 compute-0 systemd[1]: libpod-conmon-d743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc.scope: Deactivated successfully.
Oct 01 14:22:39 compute-0 podman[223808]: 2025-10-01 14:22:39.80843692 +0000 UTC m=+0.079551017 container remove d743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.816 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee5fcc2-90f1-4aad-93a4-657ed8d7905e]: (4, ("Wed Oct  1 02:22:39 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 (d743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc)\nd743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc\nWed Oct  1 02:22:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 (d743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc)\nd743bff3e05e12ecdfbba4cf8ae265217b4712d1fc5cc252df4cf9426e90a1bc\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.818 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[9262f656-8005-46ce-8b33-1d61ecbc09fc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.818 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.819 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[415f4789-abc2-446d-80f5-9f24dd466ab3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.820 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:39 compute-0 kernel: tap031a8987-80: left promiscuous mode
Oct 01 14:22:39 compute-0 nova_compute[192698]: 2025-10-01 14:22:39.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.857 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[283d7504-fad5-4ad1-8842-958d07129257]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.894 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[974f1a7a-520e-47ac-b6f2-b5f5e681e4df]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.896 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[20ab23f0-1d89-4022-8401-cefe2f761d84]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.918 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[fc52cd81-b4b8-45eb-9756-b35c896914e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 486031, 'reachable_time': 18778, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223836, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:39 compute-0 systemd[1]: run-netns-ovnmeta\x2d031a8987\x2d8430\x2d4fb6\x2da464\x2d01e4dca2fae7.mount: Deactivated successfully.
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.924 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:22:39 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:22:39.925 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[42eb4270-3cf5-4b3a-beb0-84f9f757bd0a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.207 2 DEBUG nova.virt.libvirt.vif [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-01T14:21:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1086313905',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1086313905',id=20,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:21:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-dx5vuf34',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',clean_attempts='1',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:22:23Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=c11e5fa5-006c-4cf2-a3c6-8da08c4596c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d274a684-bfdc-4a97-977d-654f7d54585b", "address": "fa:16:3e:54:70:aa", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd274a684-bf", "ovs_interfaceid": "d274a684-bfdc-4a97-977d-654f7d54585b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.207 2 DEBUG nova.network.os_vif_util [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "d274a684-bfdc-4a97-977d-654f7d54585b", "address": "fa:16:3e:54:70:aa", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd274a684-bf", "ovs_interfaceid": "d274a684-bfdc-4a97-977d-654f7d54585b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.209 2 DEBUG nova.network.os_vif_util [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:54:70:aa,bridge_name='br-int',has_traffic_filtering=True,id=d274a684-bfdc-4a97-977d-654f7d54585b,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd274a684-bf') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.209 2 DEBUG os_vif [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:70:aa,bridge_name='br-int',has_traffic_filtering=True,id=d274a684-bfdc-4a97-977d-654f7d54585b,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd274a684-bf') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.212 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd274a684-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.261 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=67224eb9-577c-410b-b364-874fb8931774) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.267 2 INFO os_vif [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:54:70:aa,bridge_name='br-int',has_traffic_filtering=True,id=d274a684-bfdc-4a97-977d-654f7d54585b,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd274a684-bf')
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.268 2 INFO nova.virt.libvirt.driver [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Deleting instance files /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9_del
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.268 2 INFO nova.virt.libvirt.driver [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Deletion of /var/lib/nova/instances/c11e5fa5-006c-4cf2-a3c6-8da08c4596c9_del complete
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.783 2 INFO nova.compute.manager [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Took 1.38 seconds to destroy the instance on the hypervisor.
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.783 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.784 2 DEBUG nova.compute.manager [-] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.784 2 DEBUG nova.network.neutron [-] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.784 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:40 compute-0 nova_compute[192698]: 2025-10-01 14:22:40.910 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:22:41 compute-0 nova_compute[192698]: 2025-10-01 14:22:41.667 2 DEBUG nova.compute.manager [req-e694c7a4-262b-45c7-b0d5-79ebd726a37a req-ce58576b-1af8-4818-b9c2-45cd0cfcf651 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Received event network-vif-unplugged-d274a684-bfdc-4a97-977d-654f7d54585b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:22:41 compute-0 nova_compute[192698]: 2025-10-01 14:22:41.668 2 DEBUG oslo_concurrency.lockutils [req-e694c7a4-262b-45c7-b0d5-79ebd726a37a req-ce58576b-1af8-4818-b9c2-45cd0cfcf651 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "c11e5fa5-006c-4cf2-a3c6-8da08c4596c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:41 compute-0 nova_compute[192698]: 2025-10-01 14:22:41.668 2 DEBUG oslo_concurrency.lockutils [req-e694c7a4-262b-45c7-b0d5-79ebd726a37a req-ce58576b-1af8-4818-b9c2-45cd0cfcf651 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "c11e5fa5-006c-4cf2-a3c6-8da08c4596c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:41 compute-0 nova_compute[192698]: 2025-10-01 14:22:41.669 2 DEBUG oslo_concurrency.lockutils [req-e694c7a4-262b-45c7-b0d5-79ebd726a37a req-ce58576b-1af8-4818-b9c2-45cd0cfcf651 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "c11e5fa5-006c-4cf2-a3c6-8da08c4596c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:41 compute-0 nova_compute[192698]: 2025-10-01 14:22:41.669 2 DEBUG nova.compute.manager [req-e694c7a4-262b-45c7-b0d5-79ebd726a37a req-ce58576b-1af8-4818-b9c2-45cd0cfcf651 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] No waiting events found dispatching network-vif-unplugged-d274a684-bfdc-4a97-977d-654f7d54585b pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:22:41 compute-0 nova_compute[192698]: 2025-10-01 14:22:41.670 2 DEBUG nova.compute.manager [req-e694c7a4-262b-45c7-b0d5-79ebd726a37a req-ce58576b-1af8-4818-b9c2-45cd0cfcf651 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Received event network-vif-unplugged-d274a684-bfdc-4a97-977d-654f7d54585b for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:22:41 compute-0 nova_compute[192698]: 2025-10-01 14:22:41.670 2 DEBUG nova.compute.manager [req-e694c7a4-262b-45c7-b0d5-79ebd726a37a req-ce58576b-1af8-4818-b9c2-45cd0cfcf651 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Received event network-vif-deleted-d274a684-bfdc-4a97-977d-654f7d54585b external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:22:41 compute-0 nova_compute[192698]: 2025-10-01 14:22:41.670 2 INFO nova.compute.manager [req-e694c7a4-262b-45c7-b0d5-79ebd726a37a req-ce58576b-1af8-4818-b9c2-45cd0cfcf651 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Neutron deleted interface d274a684-bfdc-4a97-977d-654f7d54585b; detaching it from the instance and deleting it from the info cache
Oct 01 14:22:41 compute-0 nova_compute[192698]: 2025-10-01 14:22:41.671 2 DEBUG nova.network.neutron [req-e694c7a4-262b-45c7-b0d5-79ebd726a37a req-ce58576b-1af8-4818-b9c2-45cd0cfcf651 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:22:41 compute-0 nova_compute[192698]: 2025-10-01 14:22:41.673 2 DEBUG nova.network.neutron [-] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:22:42 compute-0 nova_compute[192698]: 2025-10-01 14:22:42.180 2 INFO nova.compute.manager [-] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Took 1.40 seconds to deallocate network for instance.
Oct 01 14:22:42 compute-0 nova_compute[192698]: 2025-10-01 14:22:42.185 2 DEBUG nova.compute.manager [req-e694c7a4-262b-45c7-b0d5-79ebd726a37a req-ce58576b-1af8-4818-b9c2-45cd0cfcf651 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: c11e5fa5-006c-4cf2-a3c6-8da08c4596c9] Detach interface failed, port_id=d274a684-bfdc-4a97-977d-654f7d54585b, reason: Instance c11e5fa5-006c-4cf2-a3c6-8da08c4596c9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:22:42 compute-0 nova_compute[192698]: 2025-10-01 14:22:42.711 2 DEBUG oslo_concurrency.lockutils [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:22:42 compute-0 nova_compute[192698]: 2025-10-01 14:22:42.712 2 DEBUG oslo_concurrency.lockutils [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:22:42 compute-0 nova_compute[192698]: 2025-10-01 14:22:42.776 2 DEBUG nova.compute.provider_tree [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:22:43 compute-0 nova_compute[192698]: 2025-10-01 14:22:43.285 2 DEBUG nova.scheduler.client.report [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:22:43 compute-0 nova_compute[192698]: 2025-10-01 14:22:43.795 2 DEBUG oslo_concurrency.lockutils [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.083s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:43 compute-0 nova_compute[192698]: 2025-10-01 14:22:43.822 2 INFO nova.scheduler.client.report [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Deleted allocations for instance c11e5fa5-006c-4cf2-a3c6-8da08c4596c9
Oct 01 14:22:44 compute-0 nova_compute[192698]: 2025-10-01 14:22:44.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:44 compute-0 nova_compute[192698]: 2025-10-01 14:22:44.869 2 DEBUG oslo_concurrency.lockutils [None req-afca3164-bcfa-4cc9-8b51-ae40ac9e35c3 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "c11e5fa5-006c-4cf2-a3c6-8da08c4596c9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:22:45 compute-0 nova_compute[192698]: 2025-10-01 14:22:45.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:46 compute-0 podman[223839]: 2025-10-01 14:22:46.192230256 +0000 UTC m=+0.096150742 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:22:46 compute-0 podman[223840]: 2025-10-01 14:22:46.250695423 +0000 UTC m=+0.151648319 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Oct 01 14:22:49 compute-0 nova_compute[192698]: 2025-10-01 14:22:49.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:50 compute-0 nova_compute[192698]: 2025-10-01 14:22:50.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:53 compute-0 podman[223884]: 2025-10-01 14:22:53.161046182 +0000 UTC m=+0.082039173 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc.)
Oct 01 14:22:54 compute-0 nova_compute[192698]: 2025-10-01 14:22:54.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:55 compute-0 nova_compute[192698]: 2025-10-01 14:22:55.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:59 compute-0 podman[223905]: 2025-10-01 14:22:59.185179939 +0000 UTC m=+0.091728065 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 01 14:22:59 compute-0 nova_compute[192698]: 2025-10-01 14:22:59.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:22:59 compute-0 podman[223906]: 2025-10-01 14:22:59.195527288 +0000 UTC m=+0.096839753 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Oct 01 14:22:59 compute-0 podman[203144]: time="2025-10-01T14:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:22:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:22:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3031 "" "Go-http-client/1.1"
Oct 01 14:23:00 compute-0 nova_compute[192698]: 2025-10-01 14:23:00.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:01 compute-0 openstack_network_exporter[205307]: ERROR   14:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:23:01 compute-0 openstack_network_exporter[205307]: ERROR   14:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:23:01 compute-0 openstack_network_exporter[205307]: ERROR   14:23:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:23:01 compute-0 openstack_network_exporter[205307]: ERROR   14:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:23:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:23:01 compute-0 openstack_network_exporter[205307]: ERROR   14:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:23:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:23:04 compute-0 nova_compute[192698]: 2025-10-01 14:23:04.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:05 compute-0 podman[223946]: 2025-10-01 14:23:05.178761833 +0000 UTC m=+0.088262681 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:23:05 compute-0 nova_compute[192698]: 2025-10-01 14:23:05.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:08 compute-0 nova_compute[192698]: 2025-10-01 14:23:08.824 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:23:08 compute-0 nova_compute[192698]: 2025-10-01 14:23:08.824 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:23:09 compute-0 nova_compute[192698]: 2025-10-01 14:23:09.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:09 compute-0 nova_compute[192698]: 2025-10-01 14:23:09.332 2 DEBUG nova.compute.manager [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 01 14:23:09 compute-0 nova_compute[192698]: 2025-10-01 14:23:09.875 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:23:09 compute-0 nova_compute[192698]: 2025-10-01 14:23:09.875 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:23:09 compute-0 nova_compute[192698]: 2025-10-01 14:23:09.883 2 DEBUG nova.virt.hardware [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:23:09 compute-0 nova_compute[192698]: 2025-10-01 14:23:09.883 2 INFO nova.compute.claims [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:23:10 compute-0 nova_compute[192698]: 2025-10-01 14:23:10.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:10 compute-0 nova_compute[192698]: 2025-10-01 14:23:10.951 2 DEBUG nova.compute.provider_tree [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:23:11 compute-0 nova_compute[192698]: 2025-10-01 14:23:11.462 2 DEBUG nova.scheduler.client.report [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:23:11 compute-0 nova_compute[192698]: 2025-10-01 14:23:11.975 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:23:11 compute-0 nova_compute[192698]: 2025-10-01 14:23:11.976 2 DEBUG nova.compute.manager [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 01 14:23:12 compute-0 nova_compute[192698]: 2025-10-01 14:23:12.487 2 DEBUG nova.compute.manager [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 01 14:23:12 compute-0 nova_compute[192698]: 2025-10-01 14:23:12.488 2 DEBUG nova.network.neutron [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 01 14:23:12 compute-0 nova_compute[192698]: 2025-10-01 14:23:12.488 2 WARNING neutronclient.v2_0.client [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:23:12 compute-0 nova_compute[192698]: 2025-10-01 14:23:12.489 2 WARNING neutronclient.v2_0.client [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:23:12 compute-0 nova_compute[192698]: 2025-10-01 14:23:12.997 2 INFO nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 01 14:23:13 compute-0 nova_compute[192698]: 2025-10-01 14:23:13.507 2 DEBUG nova.compute.manager [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:14.280 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:23:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:14.280 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:23:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:14.281 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.524 2 DEBUG nova.compute.manager [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.526 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.527 2 INFO nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Creating image(s)
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.527 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.528 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.529 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.530 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.536 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.538 2 DEBUG oslo_concurrency.processutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.640 2 DEBUG oslo_concurrency.processutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.642 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.643 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.644 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.650 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.651 2 DEBUG oslo_concurrency.processutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.732 2 DEBUG oslo_concurrency.processutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.733 2 DEBUG oslo_concurrency.processutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.748 2 DEBUG nova.network.neutron [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Successfully created port: e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.790 2 DEBUG oslo_concurrency.processutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.791 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.149s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.792 2 DEBUG oslo_concurrency.processutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.877 2 DEBUG oslo_concurrency.processutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.879 2 DEBUG nova.virt.disk.api [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Checking if we can resize image /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.880 2 DEBUG oslo_concurrency.processutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.967 2 DEBUG oslo_concurrency.processutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.968 2 DEBUG nova.virt.disk.api [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Cannot resize image /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.969 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.970 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Ensure instance console log exists: /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.970 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.971 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:23:14 compute-0 nova_compute[192698]: 2025-10-01 14:23:14.972 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:23:15 compute-0 nova_compute[192698]: 2025-10-01 14:23:15.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:15 compute-0 nova_compute[192698]: 2025-10-01 14:23:15.699 2 DEBUG nova.network.neutron [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Successfully updated port: e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 01 14:23:15 compute-0 nova_compute[192698]: 2025-10-01 14:23:15.778 2 DEBUG nova.compute.manager [req-2512aac7-85d4-481a-800b-8fdc1e14c215 req-417d0036-1742-4473-9091-e8d45daa3544 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-changed-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:23:15 compute-0 nova_compute[192698]: 2025-10-01 14:23:15.779 2 DEBUG nova.compute.manager [req-2512aac7-85d4-481a-800b-8fdc1e14c215 req-417d0036-1742-4473-9091-e8d45daa3544 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Refreshing instance network info cache due to event network-changed-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:23:15 compute-0 nova_compute[192698]: 2025-10-01 14:23:15.779 2 DEBUG oslo_concurrency.lockutils [req-2512aac7-85d4-481a-800b-8fdc1e14c215 req-417d0036-1742-4473-9091-e8d45daa3544 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-a8a45d3f-7256-468b-a779-ce1dd6daedd7" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:23:15 compute-0 nova_compute[192698]: 2025-10-01 14:23:15.779 2 DEBUG oslo_concurrency.lockutils [req-2512aac7-85d4-481a-800b-8fdc1e14c215 req-417d0036-1742-4473-9091-e8d45daa3544 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-a8a45d3f-7256-468b-a779-ce1dd6daedd7" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:23:15 compute-0 nova_compute[192698]: 2025-10-01 14:23:15.780 2 DEBUG nova.network.neutron [req-2512aac7-85d4-481a-800b-8fdc1e14c215 req-417d0036-1742-4473-9091-e8d45daa3544 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Refreshing network info cache for port e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:23:16 compute-0 nova_compute[192698]: 2025-10-01 14:23:16.209 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "refresh_cache-a8a45d3f-7256-468b-a779-ce1dd6daedd7" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:23:16 compute-0 nova_compute[192698]: 2025-10-01 14:23:16.286 2 WARNING neutronclient.v2_0.client [req-2512aac7-85d4-481a-800b-8fdc1e14c215 req-417d0036-1742-4473-9091-e8d45daa3544 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:23:16 compute-0 nova_compute[192698]: 2025-10-01 14:23:16.374 2 DEBUG nova.network.neutron [req-2512aac7-85d4-481a-800b-8fdc1e14c215 req-417d0036-1742-4473-9091-e8d45daa3544 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:23:16 compute-0 nova_compute[192698]: 2025-10-01 14:23:16.534 2 DEBUG nova.network.neutron [req-2512aac7-85d4-481a-800b-8fdc1e14c215 req-417d0036-1742-4473-9091-e8d45daa3544 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:23:17 compute-0 nova_compute[192698]: 2025-10-01 14:23:17.043 2 DEBUG oslo_concurrency.lockutils [req-2512aac7-85d4-481a-800b-8fdc1e14c215 req-417d0036-1742-4473-9091-e8d45daa3544 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-a8a45d3f-7256-468b-a779-ce1dd6daedd7" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:23:17 compute-0 nova_compute[192698]: 2025-10-01 14:23:17.044 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquired lock "refresh_cache-a8a45d3f-7256-468b-a779-ce1dd6daedd7" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:23:17 compute-0 nova_compute[192698]: 2025-10-01 14:23:17.045 2 DEBUG nova.network.neutron [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:23:17 compute-0 podman[223987]: 2025-10-01 14:23:17.178791087 +0000 UTC m=+0.079204548 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Oct 01 14:23:17 compute-0 podman[223988]: 2025-10-01 14:23:17.240892131 +0000 UTC m=+0.141515317 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 01 14:23:17 compute-0 nova_compute[192698]: 2025-10-01 14:23:17.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:23:18 compute-0 nova_compute[192698]: 2025-10-01 14:23:18.388 2 DEBUG nova.network.neutron [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:23:18 compute-0 nova_compute[192698]: 2025-10-01 14:23:18.647 2 WARNING neutronclient.v2_0.client [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.397 2 DEBUG nova.network.neutron [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Updating instance_info_cache with network_info: [{"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.905 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Releasing lock "refresh_cache-a8a45d3f-7256-468b-a779-ce1dd6daedd7" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.905 2 DEBUG nova.compute.manager [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Instance network_info: |[{"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.909 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Start _get_guest_xml network_info=[{"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.916 2 WARNING nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.918 2 DEBUG nova.virt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-901157653', uuid='a8a45d3f-7256-468b-a779-ce1dd6daedd7'), owner=OwnerMeta(userid='f8897741e6ca4770b56d28d05fa3fc42', username='tempest-TestExecuteStrategies-30131345-project-admin', projectid='d43115e3729442e1b68b749acc0dabc8', projectname='tempest-TestExecuteStrategies-30131345'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759328599.9186814) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.924 2 DEBUG nova.virt.libvirt.host [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.925 2 DEBUG nova.virt.libvirt.host [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.931 2 DEBUG nova.virt.libvirt.host [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.931 2 DEBUG nova.virt.libvirt.host [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.932 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.933 2 DEBUG nova.virt.hardware [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.934 2 DEBUG nova.virt.hardware [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.934 2 DEBUG nova.virt.hardware [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.935 2 DEBUG nova.virt.hardware [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.935 2 DEBUG nova.virt.hardware [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.936 2 DEBUG nova.virt.hardware [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.936 2 DEBUG nova.virt.hardware [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.937 2 DEBUG nova.virt.hardware [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.937 2 DEBUG nova.virt.hardware [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.938 2 DEBUG nova.virt.hardware [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.938 2 DEBUG nova.virt.hardware [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.945 2 DEBUG nova.virt.libvirt.vif [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:23:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-901157653',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-901157653',id=23,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-bsln7qq4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:23:13Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=a8a45d3f-7256-468b-a779-ce1dd6daedd7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.945 2 DEBUG nova.network.os_vif_util [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.947 2 DEBUG nova.network.os_vif_util [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:6e:59,bridge_name='br-int',has_traffic_filtering=True,id=e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1187b3a-50') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:23:19 compute-0 nova_compute[192698]: 2025-10-01 14:23:19.948 2 DEBUG nova.objects.instance [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lazy-loading 'pci_devices' on Instance uuid a8a45d3f-7256-468b-a779-ce1dd6daedd7 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.458 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:23:20 compute-0 nova_compute[192698]:   <uuid>a8a45d3f-7256-468b-a779-ce1dd6daedd7</uuid>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   <name>instance-00000017</name>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteStrategies-server-901157653</nova:name>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:23:19</nova:creationTime>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:23:20 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:23:20 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:user uuid="f8897741e6ca4770b56d28d05fa3fc42">tempest-TestExecuteStrategies-30131345-project-admin</nova:user>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:project uuid="d43115e3729442e1b68b749acc0dabc8">tempest-TestExecuteStrategies-30131345</nova:project>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         <nova:port uuid="e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc">
Oct 01 14:23:20 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <system>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <entry name="serial">a8a45d3f-7256-468b-a779-ce1dd6daedd7</entry>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <entry name="uuid">a8a45d3f-7256-468b-a779-ce1dd6daedd7</entry>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     </system>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   <os>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   </os>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   <features>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   </features>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk.config"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:40:6e:59"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <target dev="tape1187b3a-50"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/console.log" append="off"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <video>
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     </video>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:23:20 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:23:20 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:23:20 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:23:20 compute-0 nova_compute[192698]: </domain>
Oct 01 14:23:20 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.461 2 DEBUG nova.compute.manager [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Preparing to wait for external event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.461 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.461 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.462 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.463 2 DEBUG nova.virt.libvirt.vif [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:23:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-901157653',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-901157653',id=23,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-bsln7qq4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:23:13Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=a8a45d3f-7256-468b-a779-ce1dd6daedd7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.463 2 DEBUG nova.network.os_vif_util [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.464 2 DEBUG nova.network.os_vif_util [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:6e:59,bridge_name='br-int',has_traffic_filtering=True,id=e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1187b3a-50') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.465 2 DEBUG os_vif [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:6e:59,bridge_name='br-int',has_traffic_filtering=True,id=e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1187b3a-50') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.466 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.467 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.468 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'fdec25cd-8817-5a2f-ae14-ce6aa423a626', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.478 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape1187b3a-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.478 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tape1187b3a-50, col_values=(('qos', UUID('0a51fd75-d2d3-4358-a2ce-263bc5a37581')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.479 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tape1187b3a-50, col_values=(('external_ids', {'iface-id': 'e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:40:6e:59', 'vm-uuid': 'a8a45d3f-7256-468b-a779-ce1dd6daedd7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:20 compute-0 NetworkManager[51741]: <info>  [1759328600.4821] manager: (tape1187b3a-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:20 compute-0 nova_compute[192698]: 2025-10-01 14:23:20.492 2 INFO os_vif [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:6e:59,bridge_name='br-int',has_traffic_filtering=True,id=e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1187b3a-50')
Oct 01 14:23:22 compute-0 nova_compute[192698]: 2025-10-01 14:23:22.053 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:23:22 compute-0 nova_compute[192698]: 2025-10-01 14:23:22.054 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:23:22 compute-0 nova_compute[192698]: 2025-10-01 14:23:22.054 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] No VIF found with MAC fa:16:3e:40:6e:59, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:23:22 compute-0 nova_compute[192698]: 2025-10-01 14:23:22.055 2 INFO nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Using config drive
Oct 01 14:23:22 compute-0 nova_compute[192698]: 2025-10-01 14:23:22.434 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:23:22 compute-0 nova_compute[192698]: 2025-10-01 14:23:22.435 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 01 14:23:22 compute-0 nova_compute[192698]: 2025-10-01 14:23:22.569 2 WARNING neutronclient.v2_0.client [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:23:23 compute-0 nova_compute[192698]: 2025-10-01 14:23:23.375 2 INFO nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Creating config drive at /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk.config
Oct 01 14:23:23 compute-0 nova_compute[192698]: 2025-10-01 14:23:23.387 2 DEBUG oslo_concurrency.processutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpdnn_35fv execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:23:23 compute-0 nova_compute[192698]: 2025-10-01 14:23:23.432 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:23:23 compute-0 nova_compute[192698]: 2025-10-01 14:23:23.433 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:23:23 compute-0 nova_compute[192698]: 2025-10-01 14:23:23.535 2 DEBUG oslo_concurrency.processutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpdnn_35fv" returned: 0 in 0.148s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:23:23 compute-0 kernel: tape1187b3a-50: entered promiscuous mode
Oct 01 14:23:23 compute-0 ovn_controller[94909]: 2025-10-01T14:23:23Z|00180|binding|INFO|Claiming lport e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc for this chassis.
Oct 01 14:23:23 compute-0 ovn_controller[94909]: 2025-10-01T14:23:23Z|00181|binding|INFO|e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc: Claiming fa:16:3e:40:6e:59 10.100.0.5
Oct 01 14:23:23 compute-0 nova_compute[192698]: 2025-10-01 14:23:23.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:23 compute-0 NetworkManager[51741]: <info>  [1759328603.6536] manager: (tape1187b3a-50): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Oct 01 14:23:23 compute-0 ovn_controller[94909]: 2025-10-01T14:23:23Z|00182|binding|INFO|Setting lport e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc up in Southbound
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.680 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:6e:59 10.100.0.5'], port_security=['fa:16:3e:40:6e:59 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a8a45d3f-7256-468b-a779-ce1dd6daedd7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:23:23 compute-0 ovn_controller[94909]: 2025-10-01T14:23:23Z|00183|binding|INFO|Setting lport e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc ovn-installed in OVS
Oct 01 14:23:23 compute-0 nova_compute[192698]: 2025-10-01 14:23:23.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.683 103791 INFO neutron.agent.ovn.metadata.agent [-] Port e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 bound to our chassis
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.685 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:23:23 compute-0 nova_compute[192698]: 2025-10-01 14:23:23.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.701 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[599fd912-0282-4bb0-9d4e-be16f6ced39f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.702 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap031a8987-81 in ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.705 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap031a8987-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.706 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9d450f-4b24-451d-bebb-0b1a97841f14]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.707 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a1f16b78-0fc7-4c70-8560-94a65f57242f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:23 compute-0 systemd-udevd[224065]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:23:23 compute-0 systemd-machined[152704]: New machine qemu-17-instance-00000017.
Oct 01 14:23:23 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000017.
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.730 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[9251d400-0262-40de-9880-1489551006a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:23 compute-0 NetworkManager[51741]: <info>  [1759328603.7331] device (tape1187b3a-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:23:23 compute-0 NetworkManager[51741]: <info>  [1759328603.7344] device (tape1187b3a-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.749 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ced5b3b2-1941-48c0-bd13-9e12abcf4dc2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:23 compute-0 podman[224044]: 2025-10-01 14:23:23.759843093 +0000 UTC m=+0.111946659 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-type=git)
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.787 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[8012243e-fa69-4621-bd47-d2ea3f1c0dd4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.796 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4a8a1d6a-d607-4eea-8160-61452e424e33]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:23 compute-0 NetworkManager[51741]: <info>  [1759328603.7983] manager: (tap031a8987-80): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.846 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[58827e08-0db9-4ded-bc3f-a0aaa13d0986]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.849 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[0ac18220-be25-4529-b5e4-6e547d63e737]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:23 compute-0 NetworkManager[51741]: <info>  [1759328603.8828] device (tap031a8987-80): carrier: link connected
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.894 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[342c1601-017b-4600-8d4d-25a00d9b12ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.926 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[3279c5b1-598a-4a76-b5e2-43032e32939d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496441, 'reachable_time': 41717, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224104, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:23 compute-0 nova_compute[192698]: 2025-10-01 14:23:23.948 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:23:23 compute-0 nova_compute[192698]: 2025-10-01 14:23:23.949 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:23:23 compute-0 nova_compute[192698]: 2025-10-01 14:23:23.949 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:23:23 compute-0 nova_compute[192698]: 2025-10-01 14:23:23.949 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.966 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[3c46eb7e-4232-4d35-bd8d-2d5b6850ca42]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:6c81'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 496441, 'tstamp': 496441}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224105, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:23 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:23.997 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[35afd667-c772-4d02-b1e0-7f47fdff7338]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496441, 'reachable_time': 41717, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224106, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.053 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[dfeb14f7-7571-4c77-b995-dbb82e4024c6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.153 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b4b7c5-c620-40bf-ad14-9967105cda56]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.155 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.155 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.155 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:23:24 compute-0 nova_compute[192698]: 2025-10-01 14:23:24.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:24 compute-0 NetworkManager[51741]: <info>  [1759328604.1586] manager: (tap031a8987-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Oct 01 14:23:24 compute-0 kernel: tap031a8987-80: entered promiscuous mode
Oct 01 14:23:24 compute-0 nova_compute[192698]: 2025-10-01 14:23:24.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.161 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:23:24 compute-0 nova_compute[192698]: 2025-10-01 14:23:24.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:24 compute-0 ovn_controller[94909]: 2025-10-01T14:23:24Z|00184|binding|INFO|Releasing lport 6dd814dc-cba2-4392-85ef-eadb8c4615f7 from this chassis (sb_readonly=0)
Oct 01 14:23:24 compute-0 nova_compute[192698]: 2025-10-01 14:23:24.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.190 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[955b5d80-6883-459a-bd48-3e5ae9428157]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.191 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.192 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.192 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 031a8987-8430-4fb6-a464-01e4dca2fae7 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.192 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.193 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[5021419a-2f36-430a-9689-6af519e40062]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.194 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.195 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1915dd08-4cfa-4134-a733-a93933ea2b70]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.196 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:23:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:23:24.199 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'env', 'PROCESS_TAG=haproxy-031a8987-8430-4fb6-a464-01e4dca2fae7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/031a8987-8430-4fb6-a464-01e4dca2fae7.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:23:24 compute-0 nova_compute[192698]: 2025-10-01 14:23:24.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:24 compute-0 nova_compute[192698]: 2025-10-01 14:23:24.475 2 DEBUG nova.compute.manager [req-e9cac918-3cdd-43a4-98a3-3540fe0d42c8 req-d46b3a3b-5f71-42e8-9eb4-a3ee8285388e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:23:24 compute-0 nova_compute[192698]: 2025-10-01 14:23:24.477 2 DEBUG oslo_concurrency.lockutils [req-e9cac918-3cdd-43a4-98a3-3540fe0d42c8 req-d46b3a3b-5f71-42e8-9eb4-a3ee8285388e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:23:24 compute-0 nova_compute[192698]: 2025-10-01 14:23:24.478 2 DEBUG oslo_concurrency.lockutils [req-e9cac918-3cdd-43a4-98a3-3540fe0d42c8 req-d46b3a3b-5f71-42e8-9eb4-a3ee8285388e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:23:24 compute-0 nova_compute[192698]: 2025-10-01 14:23:24.479 2 DEBUG oslo_concurrency.lockutils [req-e9cac918-3cdd-43a4-98a3-3540fe0d42c8 req-d46b3a3b-5f71-42e8-9eb4-a3ee8285388e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:23:24 compute-0 nova_compute[192698]: 2025-10-01 14:23:24.480 2 DEBUG nova.compute.manager [req-e9cac918-3cdd-43a4-98a3-3540fe0d42c8 req-d46b3a3b-5f71-42e8-9eb4-a3ee8285388e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Processing event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:23:24 compute-0 podman[224143]: 2025-10-01 14:23:24.672765935 +0000 UTC m=+0.085492437 container create fd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:23:24 compute-0 podman[224143]: 2025-10-01 14:23:24.630108594 +0000 UTC m=+0.042835156 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:23:24 compute-0 systemd[1]: Started libpod-conmon-fd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d.scope.
Oct 01 14:23:24 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:23:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ffa25a42151f3594e954d034789d64bd4bb58679cb199098bcfa8a4b2295d64/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:23:24 compute-0 podman[224143]: 2025-10-01 14:23:24.792656658 +0000 UTC m=+0.205383190 container init fd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930)
Oct 01 14:23:24 compute-0 podman[224143]: 2025-10-01 14:23:24.798613379 +0000 UTC m=+0.211339891 container start fd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 01 14:23:24 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[224158]: [NOTICE]   (224162) : New worker (224164) forked
Oct 01 14:23:24 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[224158]: [NOTICE]   (224162) : Loading success.
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.011 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.097 2 DEBUG nova.compute.manager [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.100 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.102 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.117 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.129 2 INFO nova.virt.libvirt.driver [-] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Instance spawned successfully.
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.130 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.178 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.431 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.432 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.479 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.480 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5817MB free_disk=73.3021240234375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.481 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.481 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.655 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.656 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.657 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.657 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.658 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:23:25 compute-0 nova_compute[192698]: 2025-10-01 14:23:25.659 2 DEBUG nova.virt.libvirt.driver [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.175 2 INFO nova.compute.manager [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Took 11.65 seconds to spawn the instance on the hypervisor.
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.176 2 DEBUG nova.compute.manager [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.548 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance a8a45d3f-7256-468b-a779-ce1dd6daedd7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.549 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.549 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:23:25 up  1:22,  0 user,  load average: 0.24, 0.24, 0.31\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_d43115e3729442e1b68b749acc0dabc8': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.570 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing inventories for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.576 2 DEBUG nova.compute.manager [req-9a650020-f322-459c-b921-0dd06b7f9475 req-79b7c803-db6d-48bc-bd1f-0c76d65dd19c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.577 2 DEBUG oslo_concurrency.lockutils [req-9a650020-f322-459c-b921-0dd06b7f9475 req-79b7c803-db6d-48bc-bd1f-0c76d65dd19c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.578 2 DEBUG oslo_concurrency.lockutils [req-9a650020-f322-459c-b921-0dd06b7f9475 req-79b7c803-db6d-48bc-bd1f-0c76d65dd19c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.578 2 DEBUG oslo_concurrency.lockutils [req-9a650020-f322-459c-b921-0dd06b7f9475 req-79b7c803-db6d-48bc-bd1f-0c76d65dd19c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.578 2 DEBUG nova.compute.manager [req-9a650020-f322-459c-b921-0dd06b7f9475 req-79b7c803-db6d-48bc-bd1f-0c76d65dd19c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] No waiting events found dispatching network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.579 2 WARNING nova.compute.manager [req-9a650020-f322-459c-b921-0dd06b7f9475 req-79b7c803-db6d-48bc-bd1f-0c76d65dd19c 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received unexpected event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc for instance with vm_state active and task_state None.
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.597 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating ProviderTree inventory for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.597 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.611 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing aggregate associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.634 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing trait associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, traits: COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.672 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:23:26 compute-0 nova_compute[192698]: 2025-10-01 14:23:26.722 2 INFO nova.compute.manager [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Took 16.88 seconds to build instance.
Oct 01 14:23:27 compute-0 nova_compute[192698]: 2025-10-01 14:23:27.181 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:23:27 compute-0 nova_compute[192698]: 2025-10-01 14:23:27.227 2 DEBUG oslo_concurrency.lockutils [None req-a9f021ef-8958-42fa-ba25-2e8f7c5a1d18 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.403s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:23:27 compute-0 nova_compute[192698]: 2025-10-01 14:23:27.690 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:23:27 compute-0 nova_compute[192698]: 2025-10-01 14:23:27.690 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.209s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:23:28 compute-0 nova_compute[192698]: 2025-10-01 14:23:28.172 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:23:28 compute-0 nova_compute[192698]: 2025-10-01 14:23:28.683 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:23:28 compute-0 nova_compute[192698]: 2025-10-01 14:23:28.683 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:23:28 compute-0 nova_compute[192698]: 2025-10-01 14:23:28.684 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:23:29 compute-0 nova_compute[192698]: 2025-10-01 14:23:29.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:29 compute-0 podman[203144]: time="2025-10-01T14:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:23:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:23:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3483 "" "Go-http-client/1.1"
Oct 01 14:23:30 compute-0 podman[224180]: 2025-10-01 14:23:30.210649329 +0000 UTC m=+0.119760811 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:23:30 compute-0 podman[224181]: 2025-10-01 14:23:30.228456549 +0000 UTC m=+0.129481623 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4)
Oct 01 14:23:30 compute-0 nova_compute[192698]: 2025-10-01 14:23:30.424 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:23:30 compute-0 nova_compute[192698]: 2025-10-01 14:23:30.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:31 compute-0 openstack_network_exporter[205307]: ERROR   14:23:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:23:31 compute-0 openstack_network_exporter[205307]: ERROR   14:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:23:31 compute-0 openstack_network_exporter[205307]: ERROR   14:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:23:31 compute-0 openstack_network_exporter[205307]: ERROR   14:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:23:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:23:31 compute-0 openstack_network_exporter[205307]: ERROR   14:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:23:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:23:31 compute-0 nova_compute[192698]: 2025-10-01 14:23:31.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:23:33 compute-0 nova_compute[192698]: 2025-10-01 14:23:33.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:23:33 compute-0 nova_compute[192698]: 2025-10-01 14:23:33.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:23:34 compute-0 nova_compute[192698]: 2025-10-01 14:23:34.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:35 compute-0 nova_compute[192698]: 2025-10-01 14:23:35.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:36 compute-0 podman[224225]: 2025-10-01 14:23:36.143204517 +0000 UTC m=+0.058716395 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:23:38 compute-0 ovn_controller[94909]: 2025-10-01T14:23:38Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:40:6e:59 10.100.0.5
Oct 01 14:23:38 compute-0 ovn_controller[94909]: 2025-10-01T14:23:38Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:40:6e:59 10.100.0.5
Oct 01 14:23:39 compute-0 nova_compute[192698]: 2025-10-01 14:23:39.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:39 compute-0 nova_compute[192698]: 2025-10-01 14:23:39.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:23:39 compute-0 nova_compute[192698]: 2025-10-01 14:23:39.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 01 14:23:40 compute-0 nova_compute[192698]: 2025-10-01 14:23:40.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:40 compute-0 nova_compute[192698]: 2025-10-01 14:23:40.752 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 01 14:23:44 compute-0 nova_compute[192698]: 2025-10-01 14:23:44.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:45 compute-0 nova_compute[192698]: 2025-10-01 14:23:45.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:48 compute-0 podman[224250]: 2025-10-01 14:23:48.18139308 +0000 UTC m=+0.081938311 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct 01 14:23:48 compute-0 podman[224251]: 2025-10-01 14:23:48.22701818 +0000 UTC m=+0.123235644 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Oct 01 14:23:49 compute-0 nova_compute[192698]: 2025-10-01 14:23:49.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:50 compute-0 nova_compute[192698]: 2025-10-01 14:23:50.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:53 compute-0 ovn_controller[94909]: 2025-10-01T14:23:53Z|00185|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 01 14:23:54 compute-0 nova_compute[192698]: 2025-10-01 14:23:54.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:54 compute-0 nova_compute[192698]: 2025-10-01 14:23:54.494 2 DEBUG nova.compute.manager [None req-e361a8d7-5aaa-489e-9e18-28229ba3187f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Oct 01 14:23:54 compute-0 nova_compute[192698]: 2025-10-01 14:23:54.578 2 DEBUG nova.compute.provider_tree [None req-e361a8d7-5aaa-489e-9e18-28229ba3187f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Updating resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 generation from 28 to 29 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 01 14:23:55 compute-0 podman[224296]: 2025-10-01 14:23:55.319666604 +0000 UTC m=+1.232566303 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Oct 01 14:23:55 compute-0 nova_compute[192698]: 2025-10-01 14:23:55.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:59 compute-0 nova_compute[192698]: 2025-10-01 14:23:59.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:23:59 compute-0 nova_compute[192698]: 2025-10-01 14:23:59.439 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:23:59 compute-0 podman[203144]: time="2025-10-01T14:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:23:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:23:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3490 "" "Go-http-client/1.1"
Oct 01 14:24:00 compute-0 nova_compute[192698]: 2025-10-01 14:24:00.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:01 compute-0 podman[224319]: 2025-10-01 14:24:01.160788876 +0000 UTC m=+0.076529345 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd)
Oct 01 14:24:01 compute-0 podman[224318]: 2025-10-01 14:24:01.163348585 +0000 UTC m=+0.072845316 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 01 14:24:01 compute-0 openstack_network_exporter[205307]: ERROR   14:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:24:01 compute-0 openstack_network_exporter[205307]: ERROR   14:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:24:01 compute-0 openstack_network_exporter[205307]: ERROR   14:24:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:24:01 compute-0 openstack_network_exporter[205307]: ERROR   14:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:24:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:24:01 compute-0 openstack_network_exporter[205307]: ERROR   14:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:24:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:24:02 compute-0 nova_compute[192698]: 2025-10-01 14:24:02.102 2 DEBUG nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Check if temp file /var/lib/nova/instances/tmp7i483hgb exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Oct 01 14:24:02 compute-0 nova_compute[192698]: 2025-10-01 14:24:02.107 2 DEBUG nova.compute.manager [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7i483hgb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a8a45d3f-7256-468b-a779-ce1dd6daedd7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Oct 01 14:24:04 compute-0 nova_compute[192698]: 2025-10-01 14:24:04.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:05 compute-0 nova_compute[192698]: 2025-10-01 14:24:05.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:06 compute-0 nova_compute[192698]: 2025-10-01 14:24:06.669 2 DEBUG oslo_concurrency.processutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:24:06 compute-0 nova_compute[192698]: 2025-10-01 14:24:06.754 2 DEBUG oslo_concurrency.processutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:24:06 compute-0 nova_compute[192698]: 2025-10-01 14:24:06.755 2 DEBUG oslo_concurrency.processutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:24:06 compute-0 nova_compute[192698]: 2025-10-01 14:24:06.819 2 DEBUG oslo_concurrency.processutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:24:06 compute-0 nova_compute[192698]: 2025-10-01 14:24:06.822 2 DEBUG nova.compute.manager [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Preparing to wait for external event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:24:06 compute-0 nova_compute[192698]: 2025-10-01 14:24:06.822 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:06 compute-0 nova_compute[192698]: 2025-10-01 14:24:06.823 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:06 compute-0 nova_compute[192698]: 2025-10-01 14:24:06.824 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:07 compute-0 podman[224365]: 2025-10-01 14:24:07.190605387 +0000 UTC m=+0.097504321 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:24:09 compute-0 nova_compute[192698]: 2025-10-01 14:24:09.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:10 compute-0 nova_compute[192698]: 2025-10-01 14:24:10.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:12 compute-0 nova_compute[192698]: 2025-10-01 14:24:12.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:12.637 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:24:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:12.638 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:24:12 compute-0 nova_compute[192698]: 2025-10-01 14:24:12.662 2 DEBUG nova.compute.manager [req-a50f58e9-1b6e-40a9-88c5-4ccd4fa9c235 req-6b97f595-0e32-4c5e-a5b7-d44a20046c1a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-unplugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:24:12 compute-0 nova_compute[192698]: 2025-10-01 14:24:12.663 2 DEBUG oslo_concurrency.lockutils [req-a50f58e9-1b6e-40a9-88c5-4ccd4fa9c235 req-6b97f595-0e32-4c5e-a5b7-d44a20046c1a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:12 compute-0 nova_compute[192698]: 2025-10-01 14:24:12.663 2 DEBUG oslo_concurrency.lockutils [req-a50f58e9-1b6e-40a9-88c5-4ccd4fa9c235 req-6b97f595-0e32-4c5e-a5b7-d44a20046c1a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:12 compute-0 nova_compute[192698]: 2025-10-01 14:24:12.663 2 DEBUG oslo_concurrency.lockutils [req-a50f58e9-1b6e-40a9-88c5-4ccd4fa9c235 req-6b97f595-0e32-4c5e-a5b7-d44a20046c1a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:12 compute-0 nova_compute[192698]: 2025-10-01 14:24:12.664 2 DEBUG nova.compute.manager [req-a50f58e9-1b6e-40a9-88c5-4ccd4fa9c235 req-6b97f595-0e32-4c5e-a5b7-d44a20046c1a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] No event matching network-vif-unplugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc in dict_keys([('network-vif-plugged', 'e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Oct 01 14:24:12 compute-0 nova_compute[192698]: 2025-10-01 14:24:12.664 2 DEBUG nova.compute.manager [req-a50f58e9-1b6e-40a9-88c5-4ccd4fa9c235 req-6b97f595-0e32-4c5e-a5b7-d44a20046c1a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-unplugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:24:13 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 01 14:24:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:14.282 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:14.283 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:14.283 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:14 compute-0 nova_compute[192698]: 2025-10-01 14:24:14.352 2 INFO nova.compute.manager [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Took 7.53 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Oct 01 14:24:14 compute-0 nova_compute[192698]: 2025-10-01 14:24:14.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:14 compute-0 nova_compute[192698]: 2025-10-01 14:24:14.719 2 DEBUG nova.compute.manager [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:24:14 compute-0 nova_compute[192698]: 2025-10-01 14:24:14.720 2 DEBUG oslo_concurrency.lockutils [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:14 compute-0 nova_compute[192698]: 2025-10-01 14:24:14.720 2 DEBUG oslo_concurrency.lockutils [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:14 compute-0 nova_compute[192698]: 2025-10-01 14:24:14.720 2 DEBUG oslo_concurrency.lockutils [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:14 compute-0 nova_compute[192698]: 2025-10-01 14:24:14.720 2 DEBUG nova.compute.manager [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Processing event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:24:14 compute-0 nova_compute[192698]: 2025-10-01 14:24:14.721 2 DEBUG nova.compute.manager [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-changed-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:24:14 compute-0 nova_compute[192698]: 2025-10-01 14:24:14.721 2 DEBUG nova.compute.manager [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Refreshing instance network info cache due to event network-changed-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:24:14 compute-0 nova_compute[192698]: 2025-10-01 14:24:14.721 2 DEBUG oslo_concurrency.lockutils [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-a8a45d3f-7256-468b-a779-ce1dd6daedd7" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:24:14 compute-0 nova_compute[192698]: 2025-10-01 14:24:14.721 2 DEBUG oslo_concurrency.lockutils [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-a8a45d3f-7256-468b-a779-ce1dd6daedd7" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:24:14 compute-0 nova_compute[192698]: 2025-10-01 14:24:14.721 2 DEBUG nova.network.neutron [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Refreshing network info cache for port e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:24:14 compute-0 nova_compute[192698]: 2025-10-01 14:24:14.722 2 DEBUG nova.compute.manager [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:24:15 compute-0 nova_compute[192698]: 2025-10-01 14:24:15.229 2 WARNING neutronclient.v2_0.client [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:24:15 compute-0 nova_compute[192698]: 2025-10-01 14:24:15.239 2 DEBUG nova.compute.manager [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp7i483hgb',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a8a45d3f-7256-468b-a779-ce1dd6daedd7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(56aa8204-2aa9-4162-80de-f3f81c7131b1),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Oct 01 14:24:15 compute-0 nova_compute[192698]: 2025-10-01 14:24:15.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:15 compute-0 nova_compute[192698]: 2025-10-01 14:24:15.761 2 DEBUG nova.objects.instance [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid a8a45d3f-7256-468b-a779-ce1dd6daedd7 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:24:15 compute-0 nova_compute[192698]: 2025-10-01 14:24:15.763 2 DEBUG nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Oct 01 14:24:15 compute-0 nova_compute[192698]: 2025-10-01 14:24:15.765 2 DEBUG nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Oct 01 14:24:15 compute-0 nova_compute[192698]: 2025-10-01 14:24:15.766 2 DEBUG nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.270 2 DEBUG nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.271 2 DEBUG nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.278 2 DEBUG nova.virt.libvirt.vif [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:23:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-901157653',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-901157653',id=23,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:23:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-bsln7qq4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:23:26Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=a8a45d3f-7256-468b-a779-ce1dd6daedd7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.279 2 DEBUG nova.network.os_vif_util [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.280 2 DEBUG nova.network.os_vif_util [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:6e:59,bridge_name='br-int',has_traffic_filtering=True,id=e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1187b3a-50') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.281 2 DEBUG nova.virt.libvirt.migration [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Updating guest XML with vif config: <interface type="ethernet">
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <mac address="fa:16:3e:40:6e:59"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <model type="virtio"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <mtu size="1442"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <target dev="tape1187b3a-50"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]: </interface>
Oct 01 14:24:16 compute-0 nova_compute[192698]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.282 2 DEBUG nova.virt.libvirt.migration [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <name>instance-00000017</name>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <uuid>a8a45d3f-7256-468b-a779-ce1dd6daedd7</uuid>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteStrategies-server-901157653</nova:name>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:23:19</nova:creationTime>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:24:16 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:24:16 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:user uuid="f8897741e6ca4770b56d28d05fa3fc42">tempest-TestExecuteStrategies-30131345-project-admin</nova:user>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:project uuid="d43115e3729442e1b68b749acc0dabc8">tempest-TestExecuteStrategies-30131345</nova:project>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:port uuid="e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc">
Oct 01 14:24:16 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <memory unit="KiB">131072</memory>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <vcpu placement="static">1</vcpu>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <resource>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <partition>/machine</partition>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </resource>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <system>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="serial">a8a45d3f-7256-468b-a779-ce1dd6daedd7</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="uuid">a8a45d3f-7256-468b-a779-ce1dd6daedd7</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </system>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <os>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </os>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <features>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <vmcoreinfo state="on"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </features>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <cpu mode="host-model" check="partial">
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <on_poweroff>destroy</on_poweroff>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <on_reboot>restart</on_reboot>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <on_crash>destroy</on_crash>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk.config"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <readonly/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="1" port="0x10"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="2" port="0x11"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="3" port="0x12"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="4" port="0x13"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="5" port="0x14"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="6" port="0x15"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="7" port="0x16"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="8" port="0x17"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="9" port="0x18"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="10" port="0x19"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="11" port="0x1a"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="12" port="0x1b"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="13" port="0x1c"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="14" port="0x1d"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="15" port="0x1e"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="16" port="0x1f"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="17" port="0x20"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="18" port="0x21"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="19" port="0x22"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="20" port="0x23"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="21" port="0x24"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="22" port="0x25"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="23" port="0x26"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="24" port="0x27"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="25" port="0x28"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-pci-bridge"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="sata" index="0">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <interface type="ethernet"><mac address="fa:16:3e:40:6e:59"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape1187b3a-50"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </interface><serial type="pty">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/console.log" append="off"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target type="isa-serial" port="0">
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <model name="isa-serial"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </target>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <console type="pty">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/console.log" append="off"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target type="serial" port="0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </console>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="usb" bus="0" port="1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </input>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <input type="mouse" bus="ps2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <listen type="address" address="::"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </graphics>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <video>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model type="virtio" heads="1" primary="yes"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </video>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]: </domain>
Oct 01 14:24:16 compute-0 nova_compute[192698]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.282 2 DEBUG nova.virt.libvirt.migration [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <name>instance-00000017</name>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <uuid>a8a45d3f-7256-468b-a779-ce1dd6daedd7</uuid>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteStrategies-server-901157653</nova:name>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:23:19</nova:creationTime>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:24:16 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:24:16 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:user uuid="f8897741e6ca4770b56d28d05fa3fc42">tempest-TestExecuteStrategies-30131345-project-admin</nova:user>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:project uuid="d43115e3729442e1b68b749acc0dabc8">tempest-TestExecuteStrategies-30131345</nova:project>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:port uuid="e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc">
Oct 01 14:24:16 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <memory unit="KiB">131072</memory>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <vcpu placement="static">1</vcpu>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <resource>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <partition>/machine</partition>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </resource>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <system>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="serial">a8a45d3f-7256-468b-a779-ce1dd6daedd7</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="uuid">a8a45d3f-7256-468b-a779-ce1dd6daedd7</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </system>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <os>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </os>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <features>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <vmcoreinfo state="on"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </features>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <cpu mode="host-model" check="partial">
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <on_poweroff>destroy</on_poweroff>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <on_reboot>restart</on_reboot>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <on_crash>destroy</on_crash>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk.config"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <readonly/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="1" port="0x10"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="2" port="0x11"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="3" port="0x12"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="4" port="0x13"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="5" port="0x14"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="6" port="0x15"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="7" port="0x16"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="8" port="0x17"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="9" port="0x18"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="10" port="0x19"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="11" port="0x1a"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="12" port="0x1b"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="13" port="0x1c"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="14" port="0x1d"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="15" port="0x1e"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="16" port="0x1f"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="17" port="0x20"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="18" port="0x21"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="19" port="0x22"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="20" port="0x23"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="21" port="0x24"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="22" port="0x25"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="23" port="0x26"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="24" port="0x27"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="25" port="0x28"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-pci-bridge"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="sata" index="0">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <interface type="ethernet"><mac address="fa:16:3e:40:6e:59"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape1187b3a-50"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </interface><serial type="pty">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/console.log" append="off"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target type="isa-serial" port="0">
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <model name="isa-serial"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </target>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <console type="pty">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/console.log" append="off"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target type="serial" port="0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </console>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="usb" bus="0" port="1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </input>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <input type="mouse" bus="ps2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <listen type="address" address="::"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </graphics>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <video>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model type="virtio" heads="1" primary="yes"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </video>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]: </domain>
Oct 01 14:24:16 compute-0 nova_compute[192698]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.283 2 DEBUG nova.virt.libvirt.migration [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] _update_pci_xml output xml=<domain type="kvm">
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <name>instance-00000017</name>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <uuid>a8a45d3f-7256-468b-a779-ce1dd6daedd7</uuid>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteStrategies-server-901157653</nova:name>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:23:19</nova:creationTime>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:24:16 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:24:16 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:user uuid="f8897741e6ca4770b56d28d05fa3fc42">tempest-TestExecuteStrategies-30131345-project-admin</nova:user>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:project uuid="d43115e3729442e1b68b749acc0dabc8">tempest-TestExecuteStrategies-30131345</nova:project>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <nova:port uuid="e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc">
Oct 01 14:24:16 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <memory unit="KiB">131072</memory>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <currentMemory unit="KiB">131072</currentMemory>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <vcpu placement="static">1</vcpu>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <resource>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <partition>/machine</partition>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </resource>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <system>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="serial">a8a45d3f-7256-468b-a779-ce1dd6daedd7</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="uuid">a8a45d3f-7256-468b-a779-ce1dd6daedd7</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </system>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <os>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </os>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <features>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <vmcoreinfo state="on"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </features>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <cpu mode="host-model" check="partial">
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <on_poweroff>destroy</on_poweroff>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <on_reboot>restart</on_reboot>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <on_crash>destroy</on_crash>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/disk.config"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <readonly/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="0" model="pcie-root"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="1" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="1" port="0x10"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="2" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="2" port="0x11"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="3" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="3" port="0x12"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="4" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="4" port="0x13"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="5" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="5" port="0x14"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="6" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="6" port="0x15"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="7" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="7" port="0x16"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="8" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="8" port="0x17"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="9" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="9" port="0x18"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="10" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="10" port="0x19"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="11" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="11" port="0x1a"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="12" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="12" port="0x1b"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="13" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="13" port="0x1c"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="14" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="14" port="0x1d"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="15" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="15" port="0x1e"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="16" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="16" port="0x1f"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="17" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="17" port="0x20"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="18" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="18" port="0x21"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="19" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="19" port="0x22"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="20" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="20" port="0x23"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="21" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="21" port="0x24"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="22" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="22" port="0x25"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="23" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="23" port="0x26"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="24" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="24" port="0x27"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="25" model="pcie-root-port">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-root-port"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target chassis="25" port="0x28"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model name="pcie-pci-bridge"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="usb" index="0" model="piix3-uhci">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <controller type="sata" index="0">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </controller>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <interface type="ethernet"><mac address="fa:16:3e:40:6e:59"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tape1187b3a-50"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </interface><serial type="pty">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/console.log" append="off"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target type="isa-serial" port="0">
Oct 01 14:24:16 compute-0 nova_compute[192698]:         <model name="isa-serial"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       </target>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <console type="pty">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7/console.log" append="off"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <target type="serial" port="0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </console>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="usb" bus="0" port="1"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </input>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <input type="mouse" bus="ps2"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <listen type="address" address="::"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </graphics>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <video>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <model type="virtio" heads="1" primary="yes"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </video>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:24:16 compute-0 nova_compute[192698]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:24:16 compute-0 nova_compute[192698]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Oct 01 14:24:16 compute-0 nova_compute[192698]: </domain>
Oct 01 14:24:16 compute-0 nova_compute[192698]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.284 2 DEBUG nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.286 2 WARNING neutronclient.v2_0.client [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.445 2 DEBUG nova.network.neutron [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Updated VIF entry in instance network info cache for port e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.445 2 DEBUG nova.network.neutron [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Updating instance_info_cache with network_info: [{"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:24:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:16.639 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.774 2 DEBUG nova.virt.libvirt.migration [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.775 2 INFO nova.virt.libvirt.migration [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Increasing downtime to 50 ms after 0 sec elapsed time
Oct 01 14:24:16 compute-0 nova_compute[192698]: 2025-10-01 14:24:16.953 2 DEBUG oslo_concurrency.lockutils [req-3a6a3b16-8fa9-48e2-a595-31960d753150 req-1b9f9fb5-1630-436b-a0f0-af01970b4257 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-a8a45d3f-7256-468b-a779-ce1dd6daedd7" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:24:17 compute-0 nova_compute[192698]: 2025-10-01 14:24:17.792 2 INFO nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Oct 01 14:24:18 compute-0 nova_compute[192698]: 2025-10-01 14:24:18.296 2 DEBUG nova.virt.libvirt.migration [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Oct 01 14:24:18 compute-0 nova_compute[192698]: 2025-10-01 14:24:18.297 2 DEBUG nova.virt.libvirt.migration [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Oct 01 14:24:18 compute-0 kernel: tape1187b3a-50 (unregistering): left promiscuous mode
Oct 01 14:24:18 compute-0 NetworkManager[51741]: <info>  [1759328658.7369] device (tape1187b3a-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:24:18 compute-0 nova_compute[192698]: 2025-10-01 14:24:18.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:18 compute-0 ovn_controller[94909]: 2025-10-01T14:24:18Z|00186|binding|INFO|Releasing lport e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc from this chassis (sb_readonly=0)
Oct 01 14:24:18 compute-0 ovn_controller[94909]: 2025-10-01T14:24:18Z|00187|binding|INFO|Setting lport e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc down in Southbound
Oct 01 14:24:18 compute-0 ovn_controller[94909]: 2025-10-01T14:24:18Z|00188|binding|INFO|Removing iface tape1187b3a-50 ovn-installed in OVS
Oct 01 14:24:18 compute-0 nova_compute[192698]: 2025-10-01 14:24:18.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:18.754 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:6e:59 10.100.0.5'], port_security=['fa:16:3e:40:6e:59 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'd71f76a2-379d-402b-b590-797cbe777099'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a8a45d3f-7256-468b-a779-ce1dd6daedd7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '10', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:24:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:18.755 103791 INFO neutron.agent.ovn.metadata.agent [-] Port e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:24:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:18.756 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 031a8987-8430-4fb6-a464-01e4dca2fae7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:24:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:18.757 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7d4bbc72-7e1a-41de-a7e7-069f7c80d153]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:24:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:18.758 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 namespace which is not needed anymore
Oct 01 14:24:18 compute-0 nova_compute[192698]: 2025-10-01 14:24:18.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:18 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Deactivated successfully.
Oct 01 14:24:18 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Consumed 16.207s CPU time.
Oct 01 14:24:18 compute-0 systemd-machined[152704]: Machine qemu-17-instance-00000017 terminated.
Oct 01 14:24:18 compute-0 podman[224406]: 2025-10-01 14:24:18.856077329 +0000 UTC m=+0.082828915 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Oct 01 14:24:18 compute-0 podman[224409]: 2025-10-01 14:24:18.883088217 +0000 UTC m=+0.110279025 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller)
Oct 01 14:24:18 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[224158]: [NOTICE]   (224162) : haproxy version is 3.0.5-8e879a5
Oct 01 14:24:18 compute-0 podman[224469]: 2025-10-01 14:24:18.925189712 +0000 UTC m=+0.039466325 container kill fd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 01 14:24:18 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[224158]: [NOTICE]   (224162) : path to executable is /usr/sbin/haproxy
Oct 01 14:24:18 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[224158]: [WARNING]  (224162) : Exiting Master process...
Oct 01 14:24:18 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[224158]: [ALERT]    (224162) : Current worker (224164) exited with code 143 (Terminated)
Oct 01 14:24:18 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[224158]: [WARNING]  (224162) : All workers exited. Exiting... (0)
Oct 01 14:24:18 compute-0 systemd[1]: libpod-fd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d.scope: Deactivated successfully.
Oct 01 14:24:18 compute-0 podman[224485]: 2025-10-01 14:24:18.987533764 +0000 UTC m=+0.036109815 container died fd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 14:24:18 compute-0 nova_compute[192698]: 2025-10-01 14:24:18.996 2 DEBUG nova.virt.libvirt.guest [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Oct 01 14:24:18 compute-0 nova_compute[192698]: 2025-10-01 14:24:18.997 2 INFO nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Migration operation has completed
Oct 01 14:24:18 compute-0 nova_compute[192698]: 2025-10-01 14:24:18.998 2 INFO nova.compute.manager [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] _post_live_migration() is started..
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.002 2 DEBUG nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.002 2 DEBUG nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.003 2 DEBUG nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.021 2 WARNING neutronclient.v2_0.client [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.022 2 WARNING neutronclient.v2_0.client [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:24:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d-userdata-shm.mount: Deactivated successfully.
Oct 01 14:24:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-8ffa25a42151f3594e954d034789d64bd4bb58679cb199098bcfa8a4b2295d64-merged.mount: Deactivated successfully.
Oct 01 14:24:19 compute-0 podman[224485]: 2025-10-01 14:24:19.043994567 +0000 UTC m=+0.092570628 container cleanup fd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct 01 14:24:19 compute-0 systemd[1]: libpod-conmon-fd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d.scope: Deactivated successfully.
Oct 01 14:24:19 compute-0 podman[224491]: 2025-10-01 14:24:19.064320225 +0000 UTC m=+0.100595884 container remove fd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:24:19 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:19.072 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[5b731a05-178c-415d-8438-20396d5cee71]: (4, ("Wed Oct  1 02:24:18 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 (fd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d)\nfd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d\nWed Oct  1 02:24:18 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 (fd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d)\nfd774786fef49abb715a1037cd7521383ab0b17cf9ed98ee91bf81ce9fb8819d\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:24:19 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:19.074 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c07a4b16-2d57-4f5a-85ff-308684e5213c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:24:19 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:19.074 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:24:19 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:19.075 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[46904fb2-0e31-4aed-840e-21cbc9c819aa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:24:19 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:19.077 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:19 compute-0 kernel: tap031a8987-80: left promiscuous mode
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:19 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:19.114 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[43d96fdb-8247-4014-b997-4b8de5c7b582]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:24:19 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:19.151 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4c93899f-9880-4d60-978f-aa592bd03be7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:24:19 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:19.153 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e07252a8-c642-4ad5-bc70-a20dda5ffc26]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:24:19 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:19.175 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[3bd002f2-49b0-44f5-af3f-bd9910d2336b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 496431, 'reachable_time': 29741, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224535, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:24:19 compute-0 systemd[1]: run-netns-ovnmeta\x2d031a8987\x2d8430\x2d4fb6\x2da464\x2d01e4dca2fae7.mount: Deactivated successfully.
Oct 01 14:24:19 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:19.178 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:24:19 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:24:19.179 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[f1b7a72e-5fd2-4f63-b765-588c30fe751c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.468 2 DEBUG nova.compute.manager [req-a89b892c-c11d-4350-92d7-88acf1d5a675 req-4db09c13-952c-4028-b5b5-c96b4fc7b073 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-unplugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.470 2 DEBUG oslo_concurrency.lockutils [req-a89b892c-c11d-4350-92d7-88acf1d5a675 req-4db09c13-952c-4028-b5b5-c96b4fc7b073 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.470 2 DEBUG oslo_concurrency.lockutils [req-a89b892c-c11d-4350-92d7-88acf1d5a675 req-4db09c13-952c-4028-b5b5-c96b4fc7b073 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.471 2 DEBUG oslo_concurrency.lockutils [req-a89b892c-c11d-4350-92d7-88acf1d5a675 req-4db09c13-952c-4028-b5b5-c96b4fc7b073 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.471 2 DEBUG nova.compute.manager [req-a89b892c-c11d-4350-92d7-88acf1d5a675 req-4db09c13-952c-4028-b5b5-c96b4fc7b073 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] No waiting events found dispatching network-vif-unplugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.471 2 DEBUG nova.compute.manager [req-a89b892c-c11d-4350-92d7-88acf1d5a675 req-4db09c13-952c-4028-b5b5-c96b4fc7b073 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-unplugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.657 2 DEBUG nova.compute.manager [req-ec91b61c-f199-4456-8cfe-1c7d48655db1 req-ac42c427-ac63-41eb-9c40-78a78fe41c9a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-unplugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.658 2 DEBUG oslo_concurrency.lockutils [req-ec91b61c-f199-4456-8cfe-1c7d48655db1 req-ac42c427-ac63-41eb-9c40-78a78fe41c9a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.659 2 DEBUG oslo_concurrency.lockutils [req-ec91b61c-f199-4456-8cfe-1c7d48655db1 req-ac42c427-ac63-41eb-9c40-78a78fe41c9a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.659 2 DEBUG oslo_concurrency.lockutils [req-ec91b61c-f199-4456-8cfe-1c7d48655db1 req-ac42c427-ac63-41eb-9c40-78a78fe41c9a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.660 2 DEBUG nova.compute.manager [req-ec91b61c-f199-4456-8cfe-1c7d48655db1 req-ac42c427-ac63-41eb-9c40-78a78fe41c9a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] No waiting events found dispatching network-vif-unplugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:24:19 compute-0 nova_compute[192698]: 2025-10-01 14:24:19.660 2 DEBUG nova.compute.manager [req-ec91b61c-f199-4456-8cfe-1c7d48655db1 req-ac42c427-ac63-41eb-9c40-78a78fe41c9a 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-unplugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.462 2 DEBUG nova.network.neutron [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Activated binding for port e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.464 2 DEBUG nova.compute.manager [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.465 2 DEBUG nova.virt.libvirt.vif [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:23:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-901157653',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-901157653',id=23,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:23:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-bsln7qq4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:23:57Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=a8a45d3f-7256-468b-a779-ce1dd6daedd7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.466 2 DEBUG nova.network.os_vif_util [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "address": "fa:16:3e:40:6e:59", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1187b3a-50", "ovs_interfaceid": "e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.467 2 DEBUG nova.network.os_vif_util [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:6e:59,bridge_name='br-int',has_traffic_filtering=True,id=e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1187b3a-50') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.468 2 DEBUG os_vif [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:6e:59,bridge_name='br-int',has_traffic_filtering=True,id=e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1187b3a-50') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.471 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1187b3a-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.478 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0a51fd75-d2d3-4358-a2ce-263bc5a37581) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.484 2 INFO os_vif [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:6e:59,bridge_name='br-int',has_traffic_filtering=True,id=e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1187b3a-50')
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.484 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.485 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.485 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.486 2 DEBUG nova.compute.manager [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.486 2 INFO nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Deleting instance files /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7_del
Oct 01 14:24:20 compute-0 nova_compute[192698]: 2025-10-01 14:24:20.488 2 INFO nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Deletion of /var/lib/nova/instances/a8a45d3f-7256-468b-a779-ce1dd6daedd7_del complete
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.522 2 DEBUG nova.compute.manager [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.523 2 DEBUG oslo_concurrency.lockutils [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.524 2 DEBUG oslo_concurrency.lockutils [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.524 2 DEBUG oslo_concurrency.lockutils [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.525 2 DEBUG nova.compute.manager [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] No waiting events found dispatching network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.525 2 WARNING nova.compute.manager [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received unexpected event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc for instance with vm_state active and task_state migrating.
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.525 2 DEBUG nova.compute.manager [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-unplugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.526 2 DEBUG oslo_concurrency.lockutils [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.526 2 DEBUG oslo_concurrency.lockutils [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.526 2 DEBUG oslo_concurrency.lockutils [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.527 2 DEBUG nova.compute.manager [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] No waiting events found dispatching network-vif-unplugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.527 2 DEBUG nova.compute.manager [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-unplugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.527 2 DEBUG nova.compute.manager [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.528 2 DEBUG oslo_concurrency.lockutils [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.528 2 DEBUG oslo_concurrency.lockutils [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.528 2 DEBUG oslo_concurrency.lockutils [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.528 2 DEBUG nova.compute.manager [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] No waiting events found dispatching network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:24:21 compute-0 nova_compute[192698]: 2025-10-01 14:24:21.529 2 WARNING nova.compute.manager [req-7481997f-3762-473e-9a2a-4f4c3c56b9e0 req-fd4b4bf4-c842-475e-bf33-9f5ffb2e5aa8 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received unexpected event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc for instance with vm_state active and task_state migrating.
Oct 01 14:24:23 compute-0 nova_compute[192698]: 2025-10-01 14:24:23.610 2 DEBUG nova.compute.manager [req-0e823d2d-3272-4135-8ab3-c1cd922fb386 req-18b7ceb0-c61e-4477-a884-3d571797c1af 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:24:23 compute-0 nova_compute[192698]: 2025-10-01 14:24:23.611 2 DEBUG oslo_concurrency.lockutils [req-0e823d2d-3272-4135-8ab3-c1cd922fb386 req-18b7ceb0-c61e-4477-a884-3d571797c1af 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:23 compute-0 nova_compute[192698]: 2025-10-01 14:24:23.611 2 DEBUG oslo_concurrency.lockutils [req-0e823d2d-3272-4135-8ab3-c1cd922fb386 req-18b7ceb0-c61e-4477-a884-3d571797c1af 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:23 compute-0 nova_compute[192698]: 2025-10-01 14:24:23.612 2 DEBUG oslo_concurrency.lockutils [req-0e823d2d-3272-4135-8ab3-c1cd922fb386 req-18b7ceb0-c61e-4477-a884-3d571797c1af 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:23 compute-0 nova_compute[192698]: 2025-10-01 14:24:23.612 2 DEBUG nova.compute.manager [req-0e823d2d-3272-4135-8ab3-c1cd922fb386 req-18b7ceb0-c61e-4477-a884-3d571797c1af 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] No waiting events found dispatching network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:24:23 compute-0 nova_compute[192698]: 2025-10-01 14:24:23.612 2 WARNING nova.compute.manager [req-0e823d2d-3272-4135-8ab3-c1cd922fb386 req-18b7ceb0-c61e-4477-a884-3d571797c1af 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Received unexpected event network-vif-plugged-e1187b3a-5035-4f2d-bb9d-ca47b9fe1dfc for instance with vm_state active and task_state migrating.
Oct 01 14:24:24 compute-0 nova_compute[192698]: 2025-10-01 14:24:24.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:24 compute-0 nova_compute[192698]: 2025-10-01 14:24:24.433 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:24:24 compute-0 nova_compute[192698]: 2025-10-01 14:24:24.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:24:24 compute-0 nova_compute[192698]: 2025-10-01 14:24:24.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:24:25 compute-0 nova_compute[192698]: 2025-10-01 14:24:25.438 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:25 compute-0 nova_compute[192698]: 2025-10-01 14:24:25.439 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:25 compute-0 nova_compute[192698]: 2025-10-01 14:24:25.439 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:25 compute-0 nova_compute[192698]: 2025-10-01 14:24:25.439 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:24:25 compute-0 nova_compute[192698]: 2025-10-01 14:24:25.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:25 compute-0 podman[224538]: 2025-10-01 14:24:25.584905702 +0000 UTC m=+0.098885758 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64)
Oct 01 14:24:25 compute-0 nova_compute[192698]: 2025-10-01 14:24:25.652 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:24:25 compute-0 nova_compute[192698]: 2025-10-01 14:24:25.653 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:24:25 compute-0 nova_compute[192698]: 2025-10-01 14:24:25.672 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:24:25 compute-0 nova_compute[192698]: 2025-10-01 14:24:25.673 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5840MB free_disk=73.3029899597168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:24:25 compute-0 nova_compute[192698]: 2025-10-01 14:24:25.673 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:25 compute-0 nova_compute[192698]: 2025-10-01 14:24:25.674 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:26 compute-0 nova_compute[192698]: 2025-10-01 14:24:26.698 2 INFO nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Updating resource usage from migration 56aa8204-2aa9-4162-80de-f3f81c7131b1
Oct 01 14:24:26 compute-0 nova_compute[192698]: 2025-10-01 14:24:26.731 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Migration 56aa8204-2aa9-4162-80de-f3f81c7131b1 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Oct 01 14:24:26 compute-0 nova_compute[192698]: 2025-10-01 14:24:26.731 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:24:26 compute-0 nova_compute[192698]: 2025-10-01 14:24:26.732 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:24:25 up  1:23,  0 user,  load average: 0.80, 0.38, 0.35\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_d43115e3729442e1b68b749acc0dabc8': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:24:26 compute-0 nova_compute[192698]: 2025-10-01 14:24:26.896 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:24:27 compute-0 nova_compute[192698]: 2025-10-01 14:24:27.405 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:24:27 compute-0 nova_compute[192698]: 2025-10-01 14:24:27.915 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:24:27 compute-0 nova_compute[192698]: 2025-10-01 14:24:27.916 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.242s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:28 compute-0 nova_compute[192698]: 2025-10-01 14:24:28.917 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:24:28 compute-0 nova_compute[192698]: 2025-10-01 14:24:28.918 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:24:29 compute-0 nova_compute[192698]: 2025-10-01 14:24:29.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:29 compute-0 podman[203144]: time="2025-10-01T14:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:24:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:24:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3025 "" "Go-http-client/1.1"
Oct 01 14:24:29 compute-0 nova_compute[192698]: 2025-10-01 14:24:29.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.046 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.047 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.047 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "a8a45d3f-7256-468b-a779-ce1dd6daedd7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.561 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.562 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.563 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.563 2 DEBUG nova.compute.resource_tracker [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.775 2 WARNING nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.777 2 DEBUG oslo_concurrency.processutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.796 2 DEBUG oslo_concurrency.processutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.797 2 DEBUG nova.compute.resource_tracker [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5852MB free_disk=73.3029899597168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.798 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:24:30 compute-0 nova_compute[192698]: 2025-10-01 14:24:30.798 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:24:31 compute-0 openstack_network_exporter[205307]: ERROR   14:24:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:24:31 compute-0 openstack_network_exporter[205307]: ERROR   14:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:24:31 compute-0 openstack_network_exporter[205307]: ERROR   14:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:24:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:24:31 compute-0 openstack_network_exporter[205307]: ERROR   14:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:24:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:24:31 compute-0 openstack_network_exporter[205307]: ERROR   14:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:24:31 compute-0 nova_compute[192698]: 2025-10-01 14:24:31.817 2 DEBUG nova.compute.resource_tracker [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Migration for instance a8a45d3f-7256-468b-a779-ce1dd6daedd7 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 01 14:24:32 compute-0 podman[224563]: 2025-10-01 14:24:32.192954788 +0000 UTC m=+0.094362736 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 01 14:24:32 compute-0 podman[224562]: 2025-10-01 14:24:32.215886807 +0000 UTC m=+0.119919656 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 01 14:24:32 compute-0 nova_compute[192698]: 2025-10-01 14:24:32.326 2 DEBUG nova.compute.resource_tracker [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 01 14:24:32 compute-0 nova_compute[192698]: 2025-10-01 14:24:32.354 2 DEBUG nova.compute.resource_tracker [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Migration 56aa8204-2aa9-4162-80de-f3f81c7131b1 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Oct 01 14:24:32 compute-0 nova_compute[192698]: 2025-10-01 14:24:32.354 2 DEBUG nova.compute.resource_tracker [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:24:32 compute-0 nova_compute[192698]: 2025-10-01 14:24:32.355 2 DEBUG nova.compute.resource_tracker [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:24:30 up  1:23,  0 user,  load average: 0.73, 0.37, 0.35\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:24:32 compute-0 nova_compute[192698]: 2025-10-01 14:24:32.396 2 DEBUG nova.compute.provider_tree [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:24:32 compute-0 nova_compute[192698]: 2025-10-01 14:24:32.903 2 DEBUG nova.scheduler.client.report [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:24:33 compute-0 nova_compute[192698]: 2025-10-01 14:24:33.414 2 DEBUG nova.compute.resource_tracker [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:24:33 compute-0 nova_compute[192698]: 2025-10-01 14:24:33.415 2 DEBUG oslo_concurrency.lockutils [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.616s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:24:33 compute-0 nova_compute[192698]: 2025-10-01 14:24:33.434 2 INFO nova.compute.manager [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Oct 01 14:24:33 compute-0 nova_compute[192698]: 2025-10-01 14:24:33.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:24:33 compute-0 nova_compute[192698]: 2025-10-01 14:24:33.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:24:33 compute-0 nova_compute[192698]: 2025-10-01 14:24:33.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:24:34 compute-0 nova_compute[192698]: 2025-10-01 14:24:34.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:34 compute-0 nova_compute[192698]: 2025-10-01 14:24:34.519 2 INFO nova.scheduler.client.report [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Deleted allocation for migration 56aa8204-2aa9-4162-80de-f3f81c7131b1
Oct 01 14:24:34 compute-0 nova_compute[192698]: 2025-10-01 14:24:34.520 2 DEBUG nova.virt.libvirt.driver [None req-230125ba-aefe-4812-a3fa-16217a890d98 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: a8a45d3f-7256-468b-a779-ce1dd6daedd7] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Oct 01 14:24:35 compute-0 nova_compute[192698]: 2025-10-01 14:24:35.100 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:24:35 compute-0 nova_compute[192698]: 2025-10-01 14:24:35.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:38 compute-0 podman[224600]: 2025-10-01 14:24:38.176830212 +0000 UTC m=+0.089327374 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:24:39 compute-0 nova_compute[192698]: 2025-10-01 14:24:39.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:40 compute-0 nova_compute[192698]: 2025-10-01 14:24:40.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:44 compute-0 nova_compute[192698]: 2025-10-01 14:24:44.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:45 compute-0 nova_compute[192698]: 2025-10-01 14:24:45.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:47 compute-0 nova_compute[192698]: 2025-10-01 14:24:47.842 2 DEBUG nova.compute.manager [None req-74275bb3-9f50-4a82-8125-130ad03e0f52 1b0ba8d8c771490ab1005529976fdb7e 9dacac6049d34f02846f752af09ae16f - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Oct 01 14:24:47 compute-0 nova_compute[192698]: 2025-10-01 14:24:47.907 2 DEBUG nova.compute.provider_tree [None req-74275bb3-9f50-4a82-8125-130ad03e0f52 1b0ba8d8c771490ab1005529976fdb7e 9dacac6049d34f02846f752af09ae16f - - default default] Updating resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 generation from 29 to 32 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Oct 01 14:24:49 compute-0 podman[224625]: 2025-10-01 14:24:49.156038891 +0000 UTC m=+0.069408918 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, 
container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 01 14:24:49 compute-0 podman[224626]: 2025-10-01 14:24:49.212609983 +0000 UTC m=+0.124669865 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct 01 14:24:49 compute-0 nova_compute[192698]: 2025-10-01 14:24:49.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:50 compute-0 nova_compute[192698]: 2025-10-01 14:24:50.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:54 compute-0 nova_compute[192698]: 2025-10-01 14:24:54.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:55 compute-0 nova_compute[192698]: 2025-10-01 14:24:55.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:56 compute-0 podman[224670]: 2025-10-01 14:24:56.147134381 +0000 UTC m=+0.062070651 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Oct 01 14:24:59 compute-0 nova_compute[192698]: 2025-10-01 14:24:59.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:24:59 compute-0 podman[203144]: time="2025-10-01T14:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:24:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:24:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3025 "" "Go-http-client/1.1"
Oct 01 14:25:00 compute-0 nova_compute[192698]: 2025-10-01 14:25:00.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:01 compute-0 openstack_network_exporter[205307]: ERROR   14:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:25:01 compute-0 openstack_network_exporter[205307]: ERROR   14:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:25:01 compute-0 openstack_network_exporter[205307]: ERROR   14:25:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:25:01 compute-0 openstack_network_exporter[205307]: ERROR   14:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:25:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:25:01 compute-0 openstack_network_exporter[205307]: ERROR   14:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:25:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:25:03 compute-0 podman[224692]: 2025-10-01 14:25:03.220982926 +0000 UTC m=+0.123245987 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 01 14:25:03 compute-0 podman[224691]: 2025-10-01 14:25:03.243358638 +0000 UTC m=+0.151917358 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:25:04 compute-0 nova_compute[192698]: 2025-10-01 14:25:04.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:05 compute-0 nova_compute[192698]: 2025-10-01 14:25:05.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:09 compute-0 podman[224729]: 2025-10-01 14:25:09.170966576 +0000 UTC m=+0.085470130 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:25:09 compute-0 nova_compute[192698]: 2025-10-01 14:25:09.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:10 compute-0 nova_compute[192698]: 2025-10-01 14:25:10.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:25:14.285 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:25:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:25:14.285 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:25:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:25:14.285 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:25:14 compute-0 nova_compute[192698]: 2025-10-01 14:25:14.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:15 compute-0 nova_compute[192698]: 2025-10-01 14:25:15.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:19 compute-0 nova_compute[192698]: 2025-10-01 14:25:19.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:20 compute-0 podman[224756]: 2025-10-01 14:25:20.182939077 +0000 UTC m=+0.092254193 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 01 14:25:20 compute-0 podman[224757]: 2025-10-01 14:25:20.237620018 +0000 UTC m=+0.133877002 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct 01 14:25:20 compute-0 nova_compute[192698]: 2025-10-01 14:25:20.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:21 compute-0 unix_chkpwd[224802]: password check failed for user (root)
Oct 01 14:25:21 compute-0 sshd-session[224755]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.115  user=root
Oct 01 14:25:23 compute-0 sshd-session[224755]: Failed password for root from 80.94.95.115 port 34012 ssh2
Oct 01 14:25:23 compute-0 sshd-session[224755]: Connection closed by authenticating user root 80.94.95.115 port 34012 [preauth]
Oct 01 14:25:24 compute-0 nova_compute[192698]: 2025-10-01 14:25:24.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:24 compute-0 nova_compute[192698]: 2025-10-01 14:25:24.438 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:25:25 compute-0 nova_compute[192698]: 2025-10-01 14:25:25.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:25 compute-0 nova_compute[192698]: 2025-10-01 14:25:25.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:25:26 compute-0 nova_compute[192698]: 2025-10-01 14:25:26.439 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:25:26 compute-0 nova_compute[192698]: 2025-10-01 14:25:26.439 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:25:26 compute-0 nova_compute[192698]: 2025-10-01 14:25:26.440 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:25:26 compute-0 nova_compute[192698]: 2025-10-01 14:25:26.440 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:25:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:25:26.494 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:25:26 compute-0 nova_compute[192698]: 2025-10-01 14:25:26.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:25:26.495 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:25:26 compute-0 nova_compute[192698]: 2025-10-01 14:25:26.622 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:25:26 compute-0 nova_compute[192698]: 2025-10-01 14:25:26.624 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:25:26 compute-0 nova_compute[192698]: 2025-10-01 14:25:26.648 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:25:26 compute-0 nova_compute[192698]: 2025-10-01 14:25:26.649 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5841MB free_disk=73.3027572631836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:25:26 compute-0 nova_compute[192698]: 2025-10-01 14:25:26.649 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:25:26 compute-0 nova_compute[192698]: 2025-10-01 14:25:26.649 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:25:27 compute-0 podman[224805]: 2025-10-01 14:25:27.170524731 +0000 UTC m=+0.078305588 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, io.openshift.expose-services=)
Oct 01 14:25:27 compute-0 nova_compute[192698]: 2025-10-01 14:25:27.749 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:25:27 compute-0 nova_compute[192698]: 2025-10-01 14:25:27.750 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:25:26 up  1:24,  0 user,  load average: 0.34, 0.32, 0.33\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:25:27 compute-0 nova_compute[192698]: 2025-10-01 14:25:27.774 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:25:28 compute-0 nova_compute[192698]: 2025-10-01 14:25:28.283 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:25:28 compute-0 nova_compute[192698]: 2025-10-01 14:25:28.799 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:25:28 compute-0 nova_compute[192698]: 2025-10-01 14:25:28.800 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.151s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:25:29 compute-0 nova_compute[192698]: 2025-10-01 14:25:29.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:29 compute-0 podman[203144]: time="2025-10-01T14:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:25:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:25:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3026 "" "Go-http-client/1.1"
Oct 01 14:25:29 compute-0 nova_compute[192698]: 2025-10-01 14:25:29.800 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:25:30 compute-0 nova_compute[192698]: 2025-10-01 14:25:30.311 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:25:30 compute-0 nova_compute[192698]: 2025-10-01 14:25:30.311 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:25:30 compute-0 nova_compute[192698]: 2025-10-01 14:25:30.312 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:25:30 compute-0 nova_compute[192698]: 2025-10-01 14:25:30.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:31 compute-0 openstack_network_exporter[205307]: ERROR   14:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:25:31 compute-0 openstack_network_exporter[205307]: ERROR   14:25:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:25:31 compute-0 openstack_network_exporter[205307]: ERROR   14:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:25:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:25:31 compute-0 openstack_network_exporter[205307]: ERROR   14:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:25:31 compute-0 openstack_network_exporter[205307]: ERROR   14:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:25:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:25:31 compute-0 nova_compute[192698]: 2025-10-01 14:25:31.426 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:25:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:25:32.496 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:25:33 compute-0 nova_compute[192698]: 2025-10-01 14:25:33.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:25:33 compute-0 nova_compute[192698]: 2025-10-01 14:25:33.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:25:33 compute-0 nova_compute[192698]: 2025-10-01 14:25:33.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:25:34 compute-0 podman[224828]: 2025-10-01 14:25:34.159916494 +0000 UTC m=+0.073129348 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Oct 01 14:25:34 compute-0 podman[224827]: 2025-10-01 14:25:34.186439788 +0000 UTC m=+0.092763467 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 01 14:25:34 compute-0 nova_compute[192698]: 2025-10-01 14:25:34.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:35 compute-0 nova_compute[192698]: 2025-10-01 14:25:35.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:39 compute-0 nova_compute[192698]: 2025-10-01 14:25:39.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:40 compute-0 podman[224867]: 2025-10-01 14:25:40.150487496 +0000 UTC m=+0.066568862 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 14:25:40 compute-0 nova_compute[192698]: 2025-10-01 14:25:40.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:44 compute-0 nova_compute[192698]: 2025-10-01 14:25:44.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:45 compute-0 nova_compute[192698]: 2025-10-01 14:25:45.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:49 compute-0 nova_compute[192698]: 2025-10-01 14:25:49.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:50 compute-0 nova_compute[192698]: 2025-10-01 14:25:50.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:51 compute-0 podman[224892]: 2025-10-01 14:25:51.175367983 +0000 UTC m=+0.080160577 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Oct 01 14:25:51 compute-0 podman[224893]: 2025-10-01 14:25:51.233779274 +0000 UTC m=+0.135709171 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Oct 01 14:25:54 compute-0 nova_compute[192698]: 2025-10-01 14:25:54.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:55 compute-0 nova_compute[192698]: 2025-10-01 14:25:55.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:56 compute-0 ovn_controller[94909]: 2025-10-01T14:25:56Z|00189|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Oct 01 14:25:58 compute-0 podman[224936]: 2025-10-01 14:25:58.186240483 +0000 UTC m=+0.069927873 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 01 14:25:58 compute-0 nova_compute[192698]: 2025-10-01 14:25:58.888 2 DEBUG nova.virt.libvirt.driver [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Creating tmpfile /var/lib/nova/instances/tmphc3o2ye5 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 01 14:25:58 compute-0 nova_compute[192698]: 2025-10-01 14:25:58.889 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:25:58 compute-0 nova_compute[192698]: 2025-10-01 14:25:58.894 2 DEBUG nova.compute.manager [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphc3o2ye5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 01 14:25:58 compute-0 nova_compute[192698]: 2025-10-01 14:25:58.954 2 DEBUG nova.virt.libvirt.driver [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Creating tmpfile /var/lib/nova/instances/tmp0klm1j17 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 01 14:25:58 compute-0 nova_compute[192698]: 2025-10-01 14:25:58.955 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:25:58 compute-0 nova_compute[192698]: 2025-10-01 14:25:58.958 2 DEBUG nova.compute.manager [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0klm1j17',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 01 14:25:59 compute-0 nova_compute[192698]: 2025-10-01 14:25:59.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:25:59 compute-0 podman[203144]: time="2025-10-01T14:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:25:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:25:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3026 "" "Go-http-client/1.1"
Oct 01 14:26:00 compute-0 nova_compute[192698]: 2025-10-01 14:26:00.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:00 compute-0 nova_compute[192698]: 2025-10-01 14:26:00.945 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:00 compute-0 nova_compute[192698]: 2025-10-01 14:26:00.990 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:01 compute-0 openstack_network_exporter[205307]: ERROR   14:26:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:26:01 compute-0 openstack_network_exporter[205307]: ERROR   14:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:26:01 compute-0 openstack_network_exporter[205307]: ERROR   14:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:26:01 compute-0 openstack_network_exporter[205307]: ERROR   14:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:26:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:26:01 compute-0 openstack_network_exporter[205307]: ERROR   14:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:26:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:26:04 compute-0 nova_compute[192698]: 2025-10-01 14:26:04.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:05 compute-0 podman[224958]: 2025-10-01 14:26:05.198137571 +0000 UTC m=+0.098145291 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:26:05 compute-0 podman[224959]: 2025-10-01 14:26:05.198231124 +0000 UTC m=+0.098643725 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 14:26:05 compute-0 nova_compute[192698]: 2025-10-01 14:26:05.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:06 compute-0 nova_compute[192698]: 2025-10-01 14:26:06.257 2 DEBUG nova.compute.manager [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphc3o2ye5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='31943dad-f83c-4e41-8ca4-baac8a025255',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 01 14:26:07 compute-0 nova_compute[192698]: 2025-10-01 14:26:07.275 2 DEBUG oslo_concurrency.lockutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-31943dad-f83c-4e41-8ca4-baac8a025255" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:26:07 compute-0 nova_compute[192698]: 2025-10-01 14:26:07.276 2 DEBUG oslo_concurrency.lockutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-31943dad-f83c-4e41-8ca4-baac8a025255" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:26:07 compute-0 nova_compute[192698]: 2025-10-01 14:26:07.276 2 DEBUG nova.network.neutron [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:26:07 compute-0 nova_compute[192698]: 2025-10-01 14:26:07.785 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:08 compute-0 nova_compute[192698]: 2025-10-01 14:26:08.511 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:08 compute-0 nova_compute[192698]: 2025-10-01 14:26:08.709 2 DEBUG nova.network.neutron [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Updating instance_info_cache with network_info: [{"id": "ec55d7c2-a601-4f1d-9c15-3aad37d2db0c", "address": "fa:16:3e:e8:fc:97", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec55d7c2-a6", "ovs_interfaceid": "ec55d7c2-a601-4f1d-9c15-3aad37d2db0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.216 2 DEBUG oslo_concurrency.lockutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-31943dad-f83c-4e41-8ca4-baac8a025255" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.235 2 DEBUG nova.virt.libvirt.driver [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphc3o2ye5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='31943dad-f83c-4e41-8ca4-baac8a025255',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.236 2 DEBUG nova.virt.libvirt.driver [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Creating instance directory: /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.236 2 DEBUG nova.virt.libvirt.driver [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Creating disk.info with the contents: {'/var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk': 'qcow2', '/var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.236 2 DEBUG nova.virt.libvirt.driver [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.237 2 DEBUG nova.objects.instance [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 31943dad-f83c-4e41-8ca4-baac8a025255 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.753 2 DEBUG oslo_utils.imageutils.format_inspector [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.759 2 DEBUG oslo_utils.imageutils.format_inspector [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.763 2 DEBUG oslo_concurrency.processutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.855 2 DEBUG oslo_concurrency.processutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.856 2 DEBUG oslo_concurrency.lockutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.857 2 DEBUG oslo_concurrency.lockutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.858 2 DEBUG oslo_utils.imageutils.format_inspector [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.860 2 DEBUG oslo_utils.imageutils.format_inspector [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.861 2 DEBUG oslo_concurrency.processutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.916 2 DEBUG oslo_concurrency.processutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:09 compute-0 nova_compute[192698]: 2025-10-01 14:26:09.917 2 DEBUG oslo_concurrency.processutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.121 2 DEBUG oslo_concurrency.processutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk 1073741824" returned: 0 in 0.203s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.123 2 DEBUG oslo_concurrency.lockutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.266s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.124 2 DEBUG oslo_concurrency.processutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.214 2 DEBUG oslo_concurrency.processutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.216 2 DEBUG nova.virt.disk.api [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.217 2 DEBUG oslo_concurrency.processutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.288 2 DEBUG oslo_concurrency.processutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.290 2 DEBUG nova.virt.disk.api [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.291 2 DEBUG nova.objects.instance [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 31943dad-f83c-4e41-8ca4-baac8a025255 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.803 2 DEBUG nova.objects.base [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<31943dad-f83c-4e41-8ca4-baac8a025255> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.804 2 DEBUG oslo_concurrency.processutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.849 2 DEBUG oslo_concurrency.processutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk.config 497664" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.850 2 DEBUG nova.virt.libvirt.driver [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.853 2 DEBUG nova.virt.libvirt.vif [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-01T14:24:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-776027965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-776027965',id=24,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:25:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-8whbp6r7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:25:08Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=31943dad-f83c-4e41-8ca4-baac8a025255,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec55d7c2-a601-4f1d-9c15-3aad37d2db0c", "address": "fa:16:3e:e8:fc:97", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapec55d7c2-a6", "ovs_interfaceid": "ec55d7c2-a601-4f1d-9c15-3aad37d2db0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.854 2 DEBUG nova.network.os_vif_util [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "ec55d7c2-a601-4f1d-9c15-3aad37d2db0c", "address": "fa:16:3e:e8:fc:97", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapec55d7c2-a6", "ovs_interfaceid": "ec55d7c2-a601-4f1d-9c15-3aad37d2db0c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.856 2 DEBUG nova.network.os_vif_util [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:fc:97,bridge_name='br-int',has_traffic_filtering=True,id=ec55d7c2-a601-4f1d-9c15-3aad37d2db0c,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec55d7c2-a6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.856 2 DEBUG os_vif [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:fc:97,bridge_name='br-int',has_traffic_filtering=True,id=ec55d7c2-a601-4f1d-9c15-3aad37d2db0c,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec55d7c2-a6') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.858 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.859 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.861 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '3dc2a254-52ca-598b-b79e-b66e509cdc42', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.871 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec55d7c2-a6, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.871 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapec55d7c2-a6, col_values=(('qos', UUID('012ac828-2d1c-43f6-a670-cadefd7763c0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.872 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapec55d7c2-a6, col_values=(('external_ids', {'iface-id': 'ec55d7c2-a601-4f1d-9c15-3aad37d2db0c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e8:fc:97', 'vm-uuid': '31943dad-f83c-4e41-8ca4-baac8a025255'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:10 compute-0 NetworkManager[51741]: <info>  [1759328770.8766] manager: (tapec55d7c2-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.885 2 INFO os_vif [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:fc:97,bridge_name='br-int',has_traffic_filtering=True,id=ec55d7c2-a601-4f1d-9c15-3aad37d2db0c,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec55d7c2-a6')
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.886 2 DEBUG nova.virt.libvirt.driver [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.887 2 DEBUG nova.compute.manager [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphc3o2ye5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='31943dad-f83c-4e41-8ca4-baac8a025255',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 01 14:26:10 compute-0 nova_compute[192698]: 2025-10-01 14:26:10.887 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:11 compute-0 podman[225018]: 2025-10-01 14:26:11.182147268 +0000 UTC m=+0.085659316 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:26:11 compute-0 nova_compute[192698]: 2025-10-01 14:26:11.449 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:12 compute-0 nova_compute[192698]: 2025-10-01 14:26:12.589 2 DEBUG nova.network.neutron [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Port ec55d7c2-a601-4f1d-9c15-3aad37d2db0c updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 01 14:26:12 compute-0 nova_compute[192698]: 2025-10-01 14:26:12.608 2 DEBUG nova.compute.manager [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphc3o2ye5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='31943dad-f83c-4e41-8ca4-baac8a025255',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 01 14:26:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:14.286 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:26:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:14.287 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:26:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:14.287 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:26:14 compute-0 nova_compute[192698]: 2025-10-01 14:26:14.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:15 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 01 14:26:15 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 01 14:26:15 compute-0 kernel: tapec55d7c2-a6: entered promiscuous mode
Oct 01 14:26:15 compute-0 NetworkManager[51741]: <info>  [1759328775.7002] manager: (tapec55d7c2-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Oct 01 14:26:15 compute-0 ovn_controller[94909]: 2025-10-01T14:26:15Z|00190|binding|INFO|Claiming lport ec55d7c2-a601-4f1d-9c15-3aad37d2db0c for this additional chassis.
Oct 01 14:26:15 compute-0 ovn_controller[94909]: 2025-10-01T14:26:15Z|00191|binding|INFO|ec55d7c2-a601-4f1d-9c15-3aad37d2db0c: Claiming fa:16:3e:e8:fc:97 10.100.0.12
Oct 01 14:26:15 compute-0 nova_compute[192698]: 2025-10-01 14:26:15.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.715 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:fc:97 10.100.0.12'], port_security=['fa:16:3e:e8:fc:97 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '31943dad-f83c-4e41-8ca4-baac8a025255', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '10', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ec55d7c2-a601-4f1d-9c15-3aad37d2db0c) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.716 103791 INFO neutron.agent.ovn.metadata.agent [-] Port ec55d7c2-a601-4f1d-9c15-3aad37d2db0c in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.718 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:26:15 compute-0 nova_compute[192698]: 2025-10-01 14:26:15.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:15 compute-0 ovn_controller[94909]: 2025-10-01T14:26:15Z|00192|binding|INFO|Setting lport ec55d7c2-a601-4f1d-9c15-3aad37d2db0c ovn-installed in OVS
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.732 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f638e440-a009-4791-bebf-db080de3dfac]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.733 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap031a8987-81 in ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.735 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap031a8987-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.735 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[aba9f4c1-165e-476b-a34a-790e50bec82d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.737 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[8bbc3f72-35b4-4ffd-b7b0-3532ebf0d40f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:15 compute-0 systemd-machined[152704]: New machine qemu-18-instance-00000018.
Oct 01 14:26:15 compute-0 systemd-udevd[225076]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.751 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[5b914646-bbfd-458b-afcf-7dd515356396]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.757 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[0389396f-733d-4706-9cbb-d73c0f89baeb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:15 compute-0 NetworkManager[51741]: <info>  [1759328775.7623] device (tapec55d7c2-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:26:15 compute-0 NetworkManager[51741]: <info>  [1759328775.7634] device (tapec55d7c2-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:26:15 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000018.
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.796 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[4d857186-bffe-411e-96a8-4a80417f03d1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.802 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[0d17d19b-bd5a-4335-a35a-ba35508f053c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:15 compute-0 NetworkManager[51741]: <info>  [1759328775.8032] manager: (tap031a8987-80): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Oct 01 14:26:15 compute-0 systemd-udevd[225080]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.839 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[8d3e0c15-b613-4927-832e-8e777db77470]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.842 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[70b32c64-c4a7-4066-bfa2-bcfc92a03239]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:15 compute-0 NetworkManager[51741]: <info>  [1759328775.8644] device (tap031a8987-80): carrier: link connected
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.870 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[f90da98f-2009-420c-9972-35e3af88d095]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:15 compute-0 nova_compute[192698]: 2025-10-01 14:26:15.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.889 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[22eb7069-93ad-4451-bc80-07de9bc77f69]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513639, 'reachable_time': 27969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225108, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.905 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c7865d1d-e7ef-4452-9b51-976d290a9d4b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:6c81'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 513639, 'tstamp': 513639}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225109, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.925 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[cf25cf8c-0630-4dc5-a344-e2fa71172da4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513639, 'reachable_time': 27969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225110, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:15.966 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[6976fc32-2f34-404d-9866-7ba7a9052825]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.025 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[3a004857-2a0a-4660-8629-8f65681034fc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.028 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.029 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.029 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:16 compute-0 kernel: tap031a8987-80: entered promiscuous mode
Oct 01 14:26:16 compute-0 NetworkManager[51741]: <info>  [1759328776.0336] manager: (tap031a8987-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Oct 01 14:26:16 compute-0 nova_compute[192698]: 2025-10-01 14:26:16.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.036 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:16 compute-0 ovn_controller[94909]: 2025-10-01T14:26:16Z|00193|binding|INFO|Releasing lport 6dd814dc-cba2-4392-85ef-eadb8c4615f7 from this chassis (sb_readonly=0)
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.039 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4a8a258a-d237-483d-b065-e3054695a586]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.040 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.040 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.040 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 031a8987-8430-4fb6-a464-01e4dca2fae7 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.041 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.041 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[2277cf98-d0ef-48eb-b973-a375cef85454]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.042 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.042 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1da6f46c-47ca-4e2a-843e-c0f311d91edb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.043 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:26:16 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:16.044 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'env', 'PROCESS_TAG=haproxy-031a8987-8430-4fb6-a464-01e4dca2fae7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/031a8987-8430-4fb6-a464-01e4dca2fae7.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:26:16 compute-0 nova_compute[192698]: 2025-10-01 14:26:16.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:16 compute-0 podman[225149]: 2025-10-01 14:26:16.44864773 +0000 UTC m=+0.025623330 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:26:16 compute-0 podman[225149]: 2025-10-01 14:26:16.683398506 +0000 UTC m=+0.260374086 container create 08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, tcib_managed=true)
Oct 01 14:26:16 compute-0 systemd[1]: Started libpod-conmon-08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5.scope.
Oct 01 14:26:16 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:26:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b55685413169f0fafa3eb1c92b5043ca617fe603e6aef931944109ccd87d9a0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:26:16 compute-0 podman[225149]: 2025-10-01 14:26:16.907231117 +0000 UTC m=+0.484206727 container init 08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Oct 01 14:26:16 compute-0 podman[225149]: 2025-10-01 14:26:16.921696667 +0000 UTC m=+0.498672227 container start 08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:26:16 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[225164]: [NOTICE]   (225168) : New worker (225170) forked
Oct 01 14:26:16 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[225164]: [NOTICE]   (225168) : Loading success.
Oct 01 14:26:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:18.556 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:26:18 compute-0 nova_compute[192698]: 2025-10-01 14:26:18.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:18.558 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:26:18 compute-0 ovn_controller[94909]: 2025-10-01T14:26:18Z|00194|binding|INFO|Claiming lport ec55d7c2-a601-4f1d-9c15-3aad37d2db0c for this chassis.
Oct 01 14:26:18 compute-0 ovn_controller[94909]: 2025-10-01T14:26:18Z|00195|binding|INFO|ec55d7c2-a601-4f1d-9c15-3aad37d2db0c: Claiming fa:16:3e:e8:fc:97 10.100.0.12
Oct 01 14:26:18 compute-0 ovn_controller[94909]: 2025-10-01T14:26:18Z|00196|binding|INFO|Setting lport ec55d7c2-a601-4f1d-9c15-3aad37d2db0c up in Southbound
Oct 01 14:26:19 compute-0 nova_compute[192698]: 2025-10-01 14:26:19.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:19 compute-0 nova_compute[192698]: 2025-10-01 14:26:19.705 2 INFO nova.compute.manager [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Post operation of migration started
Oct 01 14:26:19 compute-0 nova_compute[192698]: 2025-10-01 14:26:19.706 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:20 compute-0 nova_compute[192698]: 2025-10-01 14:26:20.454 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:20 compute-0 nova_compute[192698]: 2025-10-01 14:26:20.455 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:20 compute-0 nova_compute[192698]: 2025-10-01 14:26:20.554 2 DEBUG oslo_concurrency.lockutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-31943dad-f83c-4e41-8ca4-baac8a025255" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:26:20 compute-0 nova_compute[192698]: 2025-10-01 14:26:20.555 2 DEBUG oslo_concurrency.lockutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-31943dad-f83c-4e41-8ca4-baac8a025255" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:26:20 compute-0 nova_compute[192698]: 2025-10-01 14:26:20.555 2 DEBUG nova.network.neutron [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:26:20 compute-0 nova_compute[192698]: 2025-10-01 14:26:20.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:21 compute-0 nova_compute[192698]: 2025-10-01 14:26:21.061 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:21 compute-0 nova_compute[192698]: 2025-10-01 14:26:21.907 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:22 compute-0 nova_compute[192698]: 2025-10-01 14:26:22.047 2 DEBUG nova.network.neutron [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Updating instance_info_cache with network_info: [{"id": "ec55d7c2-a601-4f1d-9c15-3aad37d2db0c", "address": "fa:16:3e:e8:fc:97", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec55d7c2-a6", "ovs_interfaceid": "ec55d7c2-a601-4f1d-9c15-3aad37d2db0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:26:22 compute-0 podman[225194]: 2025-10-01 14:26:22.158485719 +0000 UTC m=+0.072743548 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:26:22 compute-0 podman[225195]: 2025-10-01 14:26:22.206976504 +0000 UTC m=+0.118762516 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:26:22 compute-0 nova_compute[192698]: 2025-10-01 14:26:22.557 2 DEBUG oslo_concurrency.lockutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-31943dad-f83c-4e41-8ca4-baac8a025255" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:26:22 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:22.561 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:23 compute-0 nova_compute[192698]: 2025-10-01 14:26:23.077 2 DEBUG oslo_concurrency.lockutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:26:23 compute-0 nova_compute[192698]: 2025-10-01 14:26:23.077 2 DEBUG oslo_concurrency.lockutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:26:23 compute-0 nova_compute[192698]: 2025-10-01 14:26:23.078 2 DEBUG oslo_concurrency.lockutils [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:26:23 compute-0 nova_compute[192698]: 2025-10-01 14:26:23.083 2 INFO nova.virt.libvirt.driver [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 01 14:26:23 compute-0 virtqemud[192597]: Domain id=18 name='instance-00000018' uuid=31943dad-f83c-4e41-8ca4-baac8a025255 is tainted: custom-monitor
Oct 01 14:26:23 compute-0 nova_compute[192698]: 2025-10-01 14:26:23.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:26:24 compute-0 nova_compute[192698]: 2025-10-01 14:26:24.090 2 INFO nova.virt.libvirt.driver [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 01 14:26:24 compute-0 nova_compute[192698]: 2025-10-01 14:26:24.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:25 compute-0 nova_compute[192698]: 2025-10-01 14:26:25.097 2 INFO nova.virt.libvirt.driver [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 01 14:26:25 compute-0 nova_compute[192698]: 2025-10-01 14:26:25.104 2 DEBUG nova.compute.manager [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:26:25 compute-0 nova_compute[192698]: 2025-10-01 14:26:25.619 2 DEBUG nova.objects.instance [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:26:25 compute-0 nova_compute[192698]: 2025-10-01 14:26:25.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:25 compute-0 nova_compute[192698]: 2025-10-01 14:26:25.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:26:26 compute-0 nova_compute[192698]: 2025-10-01 14:26:26.441 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:26:26 compute-0 nova_compute[192698]: 2025-10-01 14:26:26.441 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:26:26 compute-0 nova_compute[192698]: 2025-10-01 14:26:26.442 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:26:26 compute-0 nova_compute[192698]: 2025-10-01 14:26:26.442 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:26:26 compute-0 nova_compute[192698]: 2025-10-01 14:26:26.639 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:26 compute-0 nova_compute[192698]: 2025-10-01 14:26:26.728 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:26 compute-0 nova_compute[192698]: 2025-10-01 14:26:26.730 2 WARNING neutronclient.v2_0.client [None req-59b35733-3f39-4de4-b317-97c129e5c39e a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:27 compute-0 nova_compute[192698]: 2025-10-01 14:26:27.492 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:27 compute-0 nova_compute[192698]: 2025-10-01 14:26:27.586 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:27 compute-0 nova_compute[192698]: 2025-10-01 14:26:27.587 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:27 compute-0 nova_compute[192698]: 2025-10-01 14:26:27.676 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:27 compute-0 nova_compute[192698]: 2025-10-01 14:26:27.897 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:26:27 compute-0 nova_compute[192698]: 2025-10-01 14:26:27.900 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:27 compute-0 nova_compute[192698]: 2025-10-01 14:26:27.951 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:27 compute-0 nova_compute[192698]: 2025-10-01 14:26:27.952 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5660MB free_disk=73.27376556396484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:26:27 compute-0 nova_compute[192698]: 2025-10-01 14:26:27.953 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:26:27 compute-0 nova_compute[192698]: 2025-10-01 14:26:27.953 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:26:28 compute-0 nova_compute[192698]: 2025-10-01 14:26:28.978 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Applying migration context for instance 31943dad-f83c-4e41-8ca4-baac8a025255 as it has an incoming, in-progress migration 7b7e8807-4c15-42f2-96a8-5501ce714668. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Oct 01 14:26:28 compute-0 nova_compute[192698]: 2025-10-01 14:26:28.979 2 DEBUG nova.objects.instance [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:26:29 compute-0 podman[225249]: 2025-10-01 14:26:29.151679563 +0000 UTC m=+0.067082386 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 01 14:26:29 compute-0 nova_compute[192698]: 2025-10-01 14:26:29.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:29 compute-0 nova_compute[192698]: 2025-10-01 14:26:29.487 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Migration for instance ab7f70c7-9986-446d-8723-bf3a97689ca5 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 01 14:26:29 compute-0 nova_compute[192698]: 2025-10-01 14:26:29.488 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 01 14:26:29 compute-0 podman[203144]: time="2025-10-01T14:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:26:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:26:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3493 "" "Go-http-client/1.1"
Oct 01 14:26:29 compute-0 nova_compute[192698]: 2025-10-01 14:26:29.997 2 INFO nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Updating resource usage from migration 7babf08d-001c-4775-a8a6-c0840ef32b0c
Oct 01 14:26:29 compute-0 nova_compute[192698]: 2025-10-01 14:26:29.997 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Starting to track incoming migration 7babf08d-001c-4775-a8a6-c0840ef32b0c with flavor 69702c4b-38f2-49d1-96d5-85671652c67e _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 01 14:26:30 compute-0 nova_compute[192698]: 2025-10-01 14:26:30.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:31 compute-0 nova_compute[192698]: 2025-10-01 14:26:31.095 2 WARNING nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance ab7f70c7-9986-446d-8723-bf3a97689ca5 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 01 14:26:31 compute-0 nova_compute[192698]: 2025-10-01 14:26:31.096 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 31943dad-f83c-4e41-8ca4-baac8a025255 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:26:31 compute-0 nova_compute[192698]: 2025-10-01 14:26:31.096 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:26:31 compute-0 nova_compute[192698]: 2025-10-01 14:26:31.096 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:26:27 up  1:25,  0 user,  load average: 0.12, 0.26, 0.31\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_d43115e3729442e1b68b749acc0dabc8': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:26:31 compute-0 nova_compute[192698]: 2025-10-01 14:26:31.239 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:26:31 compute-0 openstack_network_exporter[205307]: ERROR   14:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:26:31 compute-0 openstack_network_exporter[205307]: ERROR   14:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:26:31 compute-0 openstack_network_exporter[205307]: ERROR   14:26:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:26:31 compute-0 openstack_network_exporter[205307]: ERROR   14:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:26:31 compute-0 openstack_network_exporter[205307]: ERROR   14:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:26:31 compute-0 nova_compute[192698]: 2025-10-01 14:26:31.747 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:26:32 compute-0 nova_compute[192698]: 2025-10-01 14:26:32.261 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:26:32 compute-0 nova_compute[192698]: 2025-10-01 14:26:32.262 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.309s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:26:34 compute-0 nova_compute[192698]: 2025-10-01 14:26:34.262 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:26:34 compute-0 nova_compute[192698]: 2025-10-01 14:26:34.262 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:26:34 compute-0 nova_compute[192698]: 2025-10-01 14:26:34.263 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:26:34 compute-0 nova_compute[192698]: 2025-10-01 14:26:34.263 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:26:34 compute-0 nova_compute[192698]: 2025-10-01 14:26:34.263 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:26:34 compute-0 nova_compute[192698]: 2025-10-01 14:26:34.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:34 compute-0 nova_compute[192698]: 2025-10-01 14:26:34.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:26:34 compute-0 nova_compute[192698]: 2025-10-01 14:26:34.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:26:35 compute-0 nova_compute[192698]: 2025-10-01 14:26:35.661 2 DEBUG nova.compute.manager [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0klm1j17',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ab7f70c7-9986-446d-8723-bf3a97689ca5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 01 14:26:35 compute-0 nova_compute[192698]: 2025-10-01 14:26:35.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:36 compute-0 podman[225272]: 2025-10-01 14:26:36.191889893 +0000 UTC m=+0.091392290 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:26:36 compute-0 podman[225271]: 2025-10-01 14:26:36.206224439 +0000 UTC m=+0.109003944 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, org.label-schema.vendor=CentOS)
Oct 01 14:26:36 compute-0 nova_compute[192698]: 2025-10-01 14:26:36.677 2 DEBUG oslo_concurrency.lockutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-ab7f70c7-9986-446d-8723-bf3a97689ca5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:26:36 compute-0 nova_compute[192698]: 2025-10-01 14:26:36.677 2 DEBUG oslo_concurrency.lockutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-ab7f70c7-9986-446d-8723-bf3a97689ca5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:26:36 compute-0 nova_compute[192698]: 2025-10-01 14:26:36.677 2 DEBUG nova.network.neutron [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:26:37 compute-0 nova_compute[192698]: 2025-10-01 14:26:37.184 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:39 compute-0 nova_compute[192698]: 2025-10-01 14:26:39.291 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:39 compute-0 nova_compute[192698]: 2025-10-01 14:26:39.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:39 compute-0 nova_compute[192698]: 2025-10-01 14:26:39.495 2 DEBUG nova.network.neutron [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Updating instance_info_cache with network_info: [{"id": "4b5e6d9d-90b3-4892-a404-bdd62fd30b6e", "address": "fa:16:3e:03:04:f6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5e6d9d-90", "ovs_interfaceid": "4b5e6d9d-90b3-4892-a404-bdd62fd30b6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.003 2 DEBUG oslo_concurrency.lockutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-ab7f70c7-9986-446d-8723-bf3a97689ca5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.020 2 DEBUG nova.virt.libvirt.driver [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0klm1j17',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ab7f70c7-9986-446d-8723-bf3a97689ca5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.021 2 DEBUG nova.virt.libvirt.driver [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Creating instance directory: /var/lib/nova/instances/ab7f70c7-9986-446d-8723-bf3a97689ca5 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.022 2 DEBUG nova.virt.libvirt.driver [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Creating disk.info with the contents: {'/var/lib/nova/instances/ab7f70c7-9986-446d-8723-bf3a97689ca5/disk': 'qcow2', '/var/lib/nova/instances/ab7f70c7-9986-446d-8723-bf3a97689ca5/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.022 2 DEBUG nova.virt.libvirt.driver [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.023 2 DEBUG nova.objects.instance [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid ab7f70c7-9986-446d-8723-bf3a97689ca5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.530 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.537 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.539 2 DEBUG oslo_concurrency.processutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.614 2 DEBUG oslo_concurrency.processutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.616 2 DEBUG oslo_concurrency.lockutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.617 2 DEBUG oslo_concurrency.lockutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.618 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.625 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.625 2 DEBUG oslo_concurrency.processutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.694 2 DEBUG oslo_concurrency.processutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.696 2 DEBUG oslo_concurrency.processutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/ab7f70c7-9986-446d-8723-bf3a97689ca5/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.806 2 DEBUG oslo_concurrency.processutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/ab7f70c7-9986-446d-8723-bf3a97689ca5/disk 1073741824" returned: 0 in 0.110s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.807 2 DEBUG oslo_concurrency.lockutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.191s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.808 2 DEBUG oslo_concurrency.processutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.870 2 DEBUG oslo_concurrency.processutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.871 2 DEBUG nova.virt.disk.api [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/ab7f70c7-9986-446d-8723-bf3a97689ca5/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.872 2 DEBUG oslo_concurrency.processutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ab7f70c7-9986-446d-8723-bf3a97689ca5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.934 2 DEBUG oslo_concurrency.processutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ab7f70c7-9986-446d-8723-bf3a97689ca5/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.935 2 DEBUG nova.virt.disk.api [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/ab7f70c7-9986-446d-8723-bf3a97689ca5/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:26:40 compute-0 nova_compute[192698]: 2025-10-01 14:26:40.935 2 DEBUG nova.objects.instance [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid ab7f70c7-9986-446d-8723-bf3a97689ca5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.443 2 DEBUG nova.objects.base [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<ab7f70c7-9986-446d-8723-bf3a97689ca5> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.445 2 DEBUG oslo_concurrency.processutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ab7f70c7-9986-446d-8723-bf3a97689ca5/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.475 2 DEBUG oslo_concurrency.processutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ab7f70c7-9986-446d-8723-bf3a97689ca5/disk.config 497664" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.477 2 DEBUG nova.virt.libvirt.driver [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.479 2 DEBUG nova.virt.libvirt.vif [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-01T14:25:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-879607803',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-879607803',id=25,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:25:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-ypwof23w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:25:28Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=ab7f70c7-9986-446d-8723-bf3a97689ca5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b5e6d9d-90b3-4892-a404-bdd62fd30b6e", "address": "fa:16:3e:03:04:f6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4b5e6d9d-90", "ovs_interfaceid": "4b5e6d9d-90b3-4892-a404-bdd62fd30b6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.479 2 DEBUG nova.network.os_vif_util [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "4b5e6d9d-90b3-4892-a404-bdd62fd30b6e", "address": "fa:16:3e:03:04:f6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4b5e6d9d-90", "ovs_interfaceid": "4b5e6d9d-90b3-4892-a404-bdd62fd30b6e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.481 2 DEBUG nova.network.os_vif_util [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:04:f6,bridge_name='br-int',has_traffic_filtering=True,id=4b5e6d9d-90b3-4892-a404-bdd62fd30b6e,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5e6d9d-90') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.482 2 DEBUG os_vif [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:04:f6,bridge_name='br-int',has_traffic_filtering=True,id=4b5e6d9d-90b3-4892-a404-bdd62fd30b6e,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5e6d9d-90') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.484 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.484 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.486 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a03b5d75-cbd2-5c1a-9bda-0331f397ced4', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.495 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b5e6d9d-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.496 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap4b5e6d9d-90, col_values=(('qos', UUID('1faf2fa3-9a3d-4471-a5b2-189e69b445e0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.496 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap4b5e6d9d-90, col_values=(('external_ids', {'iface-id': '4b5e6d9d-90b3-4892-a404-bdd62fd30b6e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:04:f6', 'vm-uuid': 'ab7f70c7-9986-446d-8723-bf3a97689ca5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:41 compute-0 NetworkManager[51741]: <info>  [1759328801.4994] manager: (tap4b5e6d9d-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.508 2 INFO os_vif [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:04:f6,bridge_name='br-int',has_traffic_filtering=True,id=4b5e6d9d-90b3-4892-a404-bdd62fd30b6e,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5e6d9d-90')
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.509 2 DEBUG nova.virt.libvirt.driver [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.510 2 DEBUG nova.compute.manager [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0klm1j17',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ab7f70c7-9986-446d-8723-bf3a97689ca5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.511 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:41 compute-0 nova_compute[192698]: 2025-10-01 14:26:41.638 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:42 compute-0 podman[225331]: 2025-10-01 14:26:42.178414895 +0000 UTC m=+0.080619020 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 14:26:43 compute-0 nova_compute[192698]: 2025-10-01 14:26:43.674 2 DEBUG nova.network.neutron [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Port 4b5e6d9d-90b3-4892-a404-bdd62fd30b6e updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 01 14:26:43 compute-0 nova_compute[192698]: 2025-10-01 14:26:43.693 2 DEBUG nova.compute.manager [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0klm1j17',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ab7f70c7-9986-446d-8723-bf3a97689ca5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 01 14:26:43 compute-0 sshd-session[225355]: Connection closed by 101.47.181.100 port 37140
Oct 01 14:26:44 compute-0 nova_compute[192698]: 2025-10-01 14:26:44.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:46 compute-0 nova_compute[192698]: 2025-10-01 14:26:46.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:46 compute-0 kernel: tap4b5e6d9d-90: entered promiscuous mode
Oct 01 14:26:46 compute-0 NetworkManager[51741]: <info>  [1759328806.5048] manager: (tap4b5e6d9d-90): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Oct 01 14:26:46 compute-0 nova_compute[192698]: 2025-10-01 14:26:46.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:46 compute-0 ovn_controller[94909]: 2025-10-01T14:26:46Z|00197|binding|INFO|Claiming lport 4b5e6d9d-90b3-4892-a404-bdd62fd30b6e for this additional chassis.
Oct 01 14:26:46 compute-0 ovn_controller[94909]: 2025-10-01T14:26:46Z|00198|binding|INFO|4b5e6d9d-90b3-4892-a404-bdd62fd30b6e: Claiming fa:16:3e:03:04:f6 10.100.0.5
Oct 01 14:26:46 compute-0 ovn_controller[94909]: 2025-10-01T14:26:46Z|00199|binding|INFO|Setting lport 4b5e6d9d-90b3-4892-a404-bdd62fd30b6e ovn-installed in OVS
Oct 01 14:26:46 compute-0 nova_compute[192698]: 2025-10-01 14:26:46.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:46 compute-0 nova_compute[192698]: 2025-10-01 14:26:46.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.528 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:04:f6 10.100.0.5'], port_security=['fa:16:3e:03:04:f6 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ab7f70c7-9986-446d-8723-bf3a97689ca5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '10', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=4b5e6d9d-90b3-4892-a404-bdd62fd30b6e) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.529 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 4b5e6d9d-90b3-4892-a404-bdd62fd30b6e in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.530 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:26:46 compute-0 systemd-udevd[225370]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:26:46 compute-0 NetworkManager[51741]: <info>  [1759328806.5524] device (tap4b5e6d9d-90): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:26:46 compute-0 NetworkManager[51741]: <info>  [1759328806.5535] device (tap4b5e6d9d-90): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.553 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[798084f6-532f-449a-bdc3-210889a1731e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:46 compute-0 systemd-machined[152704]: New machine qemu-19-instance-00000019.
Oct 01 14:26:46 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000019.
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.602 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[d6cec7da-3cb5-4cb4-be49-401e1828a6f9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.606 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[ad71f693-0d82-40d2-b795-bc0902a8cdc1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.648 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[ad35a3b6-cb1b-4095-a1c5-f4b1779bd6aa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.667 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[36aa2fa2-e774-47b2-b2d6-d3e6822d5b17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513639, 'reachable_time': 27969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225387, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.684 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ec95bb39-4477-419e-810b-31a07892e395]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 513652, 'tstamp': 513652}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225388, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 513655, 'tstamp': 513655}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225388, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.686 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:46 compute-0 nova_compute[192698]: 2025-10-01 14:26:46.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.690 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.691 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.691 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.691 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:26:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:26:46.693 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4cf11310-0243-46ed-8fc5-d388c9c4f226]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:26:48 compute-0 unix_chkpwd[225409]: password check failed for user (root)
Oct 01 14:26:48 compute-0 sshd-session[225356]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:26:48 compute-0 ovn_controller[94909]: 2025-10-01T14:26:48Z|00200|binding|INFO|Claiming lport 4b5e6d9d-90b3-4892-a404-bdd62fd30b6e for this chassis.
Oct 01 14:26:48 compute-0 ovn_controller[94909]: 2025-10-01T14:26:48Z|00201|binding|INFO|4b5e6d9d-90b3-4892-a404-bdd62fd30b6e: Claiming fa:16:3e:03:04:f6 10.100.0.5
Oct 01 14:26:48 compute-0 ovn_controller[94909]: 2025-10-01T14:26:48Z|00202|binding|INFO|Setting lport 4b5e6d9d-90b3-4892-a404-bdd62fd30b6e up in Southbound
Oct 01 14:26:49 compute-0 nova_compute[192698]: 2025-10-01 14:26:49.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:50 compute-0 sshd-session[225356]: Failed password for root from 101.47.181.100 port 37156 ssh2
Oct 01 14:26:51 compute-0 nova_compute[192698]: 2025-10-01 14:26:51.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:51 compute-0 nova_compute[192698]: 2025-10-01 14:26:51.525 2 INFO nova.compute.manager [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Post operation of migration started
Oct 01 14:26:51 compute-0 nova_compute[192698]: 2025-10-01 14:26:51.526 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:52 compute-0 nova_compute[192698]: 2025-10-01 14:26:52.176 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:52 compute-0 nova_compute[192698]: 2025-10-01 14:26:52.177 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:52 compute-0 nova_compute[192698]: 2025-10-01 14:26:52.247 2 DEBUG oslo_concurrency.lockutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-ab7f70c7-9986-446d-8723-bf3a97689ca5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:26:52 compute-0 nova_compute[192698]: 2025-10-01 14:26:52.248 2 DEBUG oslo_concurrency.lockutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-ab7f70c7-9986-446d-8723-bf3a97689ca5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:26:52 compute-0 nova_compute[192698]: 2025-10-01 14:26:52.248 2 DEBUG nova.network.neutron [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:26:52 compute-0 sshd-session[225356]: Connection closed by authenticating user root 101.47.181.100 port 37156 [preauth]
Oct 01 14:26:52 compute-0 nova_compute[192698]: 2025-10-01 14:26:52.760 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:53 compute-0 podman[225410]: 2025-10-01 14:26:53.198783841 +0000 UTC m=+0.086140028 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 01 14:26:53 compute-0 podman[225411]: 2025-10-01 14:26:53.255100356 +0000 UTC m=+0.140876631 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct 01 14:26:53 compute-0 nova_compute[192698]: 2025-10-01 14:26:53.739 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:54 compute-0 nova_compute[192698]: 2025-10-01 14:26:54.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:54 compute-0 nova_compute[192698]: 2025-10-01 14:26:54.462 2 DEBUG nova.network.neutron [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Updating instance_info_cache with network_info: [{"id": "4b5e6d9d-90b3-4892-a404-bdd62fd30b6e", "address": "fa:16:3e:03:04:f6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5e6d9d-90", "ovs_interfaceid": "4b5e6d9d-90b3-4892-a404-bdd62fd30b6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:26:54 compute-0 nova_compute[192698]: 2025-10-01 14:26:54.970 2 DEBUG oslo_concurrency.lockutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-ab7f70c7-9986-446d-8723-bf3a97689ca5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:26:55 compute-0 nova_compute[192698]: 2025-10-01 14:26:55.494 2 DEBUG oslo_concurrency.lockutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:26:55 compute-0 nova_compute[192698]: 2025-10-01 14:26:55.494 2 DEBUG oslo_concurrency.lockutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:26:55 compute-0 nova_compute[192698]: 2025-10-01 14:26:55.494 2 DEBUG oslo_concurrency.lockutils [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:26:55 compute-0 nova_compute[192698]: 2025-10-01 14:26:55.500 2 INFO nova.virt.libvirt.driver [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 01 14:26:55 compute-0 virtqemud[192597]: Domain id=19 name='instance-00000019' uuid=ab7f70c7-9986-446d-8723-bf3a97689ca5 is tainted: custom-monitor
Oct 01 14:26:56 compute-0 nova_compute[192698]: 2025-10-01 14:26:56.508 2 INFO nova.virt.libvirt.driver [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 01 14:26:56 compute-0 nova_compute[192698]: 2025-10-01 14:26:56.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:57 compute-0 nova_compute[192698]: 2025-10-01 14:26:57.515 2 INFO nova.virt.libvirt.driver [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 01 14:26:57 compute-0 nova_compute[192698]: 2025-10-01 14:26:57.523 2 DEBUG nova.compute.manager [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:26:58 compute-0 nova_compute[192698]: 2025-10-01 14:26:58.036 2 DEBUG nova.objects.instance [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:26:58 compute-0 unix_chkpwd[225459]: password check failed for user (root)
Oct 01 14:26:58 compute-0 sshd-session[225457]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:26:59 compute-0 nova_compute[192698]: 2025-10-01 14:26:59.057 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:59 compute-0 nova_compute[192698]: 2025-10-01 14:26:59.295 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:59 compute-0 nova_compute[192698]: 2025-10-01 14:26:59.296 2 WARNING neutronclient.v2_0.client [None req-7f3335db-deee-440a-b2f7-724acf87a151 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:26:59 compute-0 nova_compute[192698]: 2025-10-01 14:26:59.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:26:59 compute-0 podman[203144]: time="2025-10-01T14:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:26:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:26:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3488 "" "Go-http-client/1.1"
Oct 01 14:27:00 compute-0 podman[225461]: 2025-10-01 14:27:00.186659123 +0000 UTC m=+0.093050384 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, 
distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 01 14:27:00 compute-0 sshd-session[225457]: Failed password for root from 101.47.181.100 port 51334 ssh2
Oct 01 14:27:01 compute-0 openstack_network_exporter[205307]: ERROR   14:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:27:01 compute-0 openstack_network_exporter[205307]: ERROR   14:27:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:27:01 compute-0 openstack_network_exporter[205307]: ERROR   14:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:27:01 compute-0 openstack_network_exporter[205307]: ERROR   14:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:27:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:27:01 compute-0 openstack_network_exporter[205307]: ERROR   14:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:27:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:27:01 compute-0 nova_compute[192698]: 2025-10-01 14:27:01.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:02 compute-0 sshd-session[225457]: Connection closed by authenticating user root 101.47.181.100 port 51334 [preauth]
Oct 01 14:27:02 compute-0 nova_compute[192698]: 2025-10-01 14:27:02.471 2 DEBUG oslo_concurrency.lockutils [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "ab7f70c7-9986-446d-8723-bf3a97689ca5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:27:02 compute-0 nova_compute[192698]: 2025-10-01 14:27:02.471 2 DEBUG oslo_concurrency.lockutils [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "ab7f70c7-9986-446d-8723-bf3a97689ca5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:27:02 compute-0 nova_compute[192698]: 2025-10-01 14:27:02.472 2 DEBUG oslo_concurrency.lockutils [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "ab7f70c7-9986-446d-8723-bf3a97689ca5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:27:02 compute-0 nova_compute[192698]: 2025-10-01 14:27:02.472 2 DEBUG oslo_concurrency.lockutils [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "ab7f70c7-9986-446d-8723-bf3a97689ca5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:27:02 compute-0 nova_compute[192698]: 2025-10-01 14:27:02.472 2 DEBUG oslo_concurrency.lockutils [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "ab7f70c7-9986-446d-8723-bf3a97689ca5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:27:02 compute-0 nova_compute[192698]: 2025-10-01 14:27:02.507 2 INFO nova.compute.manager [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Terminating instance
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.095 2 DEBUG nova.compute.manager [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:27:03 compute-0 kernel: tap4b5e6d9d-90 (unregistering): left promiscuous mode
Oct 01 14:27:03 compute-0 NetworkManager[51741]: <info>  [1759328823.1202] device (tap4b5e6d9d-90): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:03 compute-0 ovn_controller[94909]: 2025-10-01T14:27:03Z|00203|binding|INFO|Releasing lport 4b5e6d9d-90b3-4892-a404-bdd62fd30b6e from this chassis (sb_readonly=0)
Oct 01 14:27:03 compute-0 ovn_controller[94909]: 2025-10-01T14:27:03Z|00204|binding|INFO|Setting lport 4b5e6d9d-90b3-4892-a404-bdd62fd30b6e down in Southbound
Oct 01 14:27:03 compute-0 ovn_controller[94909]: 2025-10-01T14:27:03Z|00205|binding|INFO|Removing iface tap4b5e6d9d-90 ovn-installed in OVS
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.146 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:04:f6 10.100.0.5'], port_security=['fa:16:3e:03:04:f6 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ab7f70c7-9986-446d-8723-bf3a97689ca5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '15', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=4b5e6d9d-90b3-4892-a404-bdd62fd30b6e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.147 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 4b5e6d9d-90b3-4892-a404-bdd62fd30b6e in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.149 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.179 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e049a324-9f02-4aa7-bb43-46c0b3d458cc]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:03 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Deactivated successfully.
Oct 01 14:27:03 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Consumed 2.013s CPU time.
Oct 01 14:27:03 compute-0 systemd-machined[152704]: Machine qemu-19-instance-00000019 terminated.
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.231 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[9bed087a-1742-433b-9ee9-ddb31784b858]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.235 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a46e4c6d-2a06-436e-98c9-e814d53fb69d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.277 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[136cf943-10e5-42e0-badf-289652a1e9e2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.304 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d604d875-5595-42b2-be76-6556dd703187]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513639, 'reachable_time': 27969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225493, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.329 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[11c200e7-3558-4678-9932-df9fa4a3aca6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 513652, 'tstamp': 513652}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225496, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 513655, 'tstamp': 513655}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225496, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.330 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.342 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.342 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.343 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.343 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:27:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:03.344 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[8ec40b3a-b71a-400b-a419-6acc02ce5875]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.369 2 INFO nova.virt.libvirt.driver [-] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Instance destroyed successfully.
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.370 2 DEBUG nova.objects.instance [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lazy-loading 'resources' on Instance uuid ab7f70c7-9986-446d-8723-bf3a97689ca5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.592 2 DEBUG nova.compute.manager [req-9159d970-5507-4225-9411-1c300a4aa2a6 req-1b24e88e-0e3d-4ae7-b184-813176f28a28 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Received event network-vif-unplugged-4b5e6d9d-90b3-4892-a404-bdd62fd30b6e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.593 2 DEBUG oslo_concurrency.lockutils [req-9159d970-5507-4225-9411-1c300a4aa2a6 req-1b24e88e-0e3d-4ae7-b184-813176f28a28 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ab7f70c7-9986-446d-8723-bf3a97689ca5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.593 2 DEBUG oslo_concurrency.lockutils [req-9159d970-5507-4225-9411-1c300a4aa2a6 req-1b24e88e-0e3d-4ae7-b184-813176f28a28 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ab7f70c7-9986-446d-8723-bf3a97689ca5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.593 2 DEBUG oslo_concurrency.lockutils [req-9159d970-5507-4225-9411-1c300a4aa2a6 req-1b24e88e-0e3d-4ae7-b184-813176f28a28 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ab7f70c7-9986-446d-8723-bf3a97689ca5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.593 2 DEBUG nova.compute.manager [req-9159d970-5507-4225-9411-1c300a4aa2a6 req-1b24e88e-0e3d-4ae7-b184-813176f28a28 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] No waiting events found dispatching network-vif-unplugged-4b5e6d9d-90b3-4892-a404-bdd62fd30b6e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.593 2 DEBUG nova.compute.manager [req-9159d970-5507-4225-9411-1c300a4aa2a6 req-1b24e88e-0e3d-4ae7-b184-813176f28a28 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Received event network-vif-unplugged-4b5e6d9d-90b3-4892-a404-bdd62fd30b6e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.877 2 DEBUG nova.virt.libvirt.vif [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-01T14:25:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-879607803',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-879607803',id=25,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:25:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-ypwof23w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',clean_attempts='1',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:26:58Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=ab7f70c7-9986-446d-8723-bf3a97689ca5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b5e6d9d-90b3-4892-a404-bdd62fd30b6e", "address": "fa:16:3e:03:04:f6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5e6d9d-90", "ovs_interfaceid": "4b5e6d9d-90b3-4892-a404-bdd62fd30b6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.877 2 DEBUG nova.network.os_vif_util [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "4b5e6d9d-90b3-4892-a404-bdd62fd30b6e", "address": "fa:16:3e:03:04:f6", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b5e6d9d-90", "ovs_interfaceid": "4b5e6d9d-90b3-4892-a404-bdd62fd30b6e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.878 2 DEBUG nova.network.os_vif_util [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:03:04:f6,bridge_name='br-int',has_traffic_filtering=True,id=4b5e6d9d-90b3-4892-a404-bdd62fd30b6e,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5e6d9d-90') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.878 2 DEBUG os_vif [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:04:f6,bridge_name='br-int',has_traffic_filtering=True,id=4b5e6d9d-90b3-4892-a404-bdd62fd30b6e,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5e6d9d-90') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.879 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b5e6d9d-90, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.886 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=1faf2fa3-9a3d-4471-a5b2-189e69b445e0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.891 2 INFO os_vif [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:03:04:f6,bridge_name='br-int',has_traffic_filtering=True,id=4b5e6d9d-90b3-4892-a404-bdd62fd30b6e,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b5e6d9d-90')
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.892 2 INFO nova.virt.libvirt.driver [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Deleting instance files /var/lib/nova/instances/ab7f70c7-9986-446d-8723-bf3a97689ca5_del
Oct 01 14:27:03 compute-0 nova_compute[192698]: 2025-10-01 14:27:03.893 2 INFO nova.virt.libvirt.driver [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Deletion of /var/lib/nova/instances/ab7f70c7-9986-446d-8723-bf3a97689ca5_del complete
Oct 01 14:27:04 compute-0 nova_compute[192698]: 2025-10-01 14:27:04.411 2 INFO nova.compute.manager [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 01 14:27:04 compute-0 nova_compute[192698]: 2025-10-01 14:27:04.411 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:27:04 compute-0 nova_compute[192698]: 2025-10-01 14:27:04.412 2 DEBUG nova.compute.manager [-] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:27:04 compute-0 nova_compute[192698]: 2025-10-01 14:27:04.412 2 DEBUG nova.network.neutron [-] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:27:04 compute-0 nova_compute[192698]: 2025-10-01 14:27:04.413 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:27:04 compute-0 nova_compute[192698]: 2025-10-01 14:27:04.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:05 compute-0 nova_compute[192698]: 2025-10-01 14:27:05.274 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:27:05 compute-0 nova_compute[192698]: 2025-10-01 14:27:05.694 2 DEBUG nova.compute.manager [req-8363be67-8f7d-4727-be63-cd68d5fcfd42 req-234d955b-f39b-48c0-a44e-22fcbaa76aaa 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Received event network-vif-unplugged-4b5e6d9d-90b3-4892-a404-bdd62fd30b6e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:27:05 compute-0 nova_compute[192698]: 2025-10-01 14:27:05.694 2 DEBUG oslo_concurrency.lockutils [req-8363be67-8f7d-4727-be63-cd68d5fcfd42 req-234d955b-f39b-48c0-a44e-22fcbaa76aaa 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "ab7f70c7-9986-446d-8723-bf3a97689ca5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:27:05 compute-0 nova_compute[192698]: 2025-10-01 14:27:05.694 2 DEBUG oslo_concurrency.lockutils [req-8363be67-8f7d-4727-be63-cd68d5fcfd42 req-234d955b-f39b-48c0-a44e-22fcbaa76aaa 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ab7f70c7-9986-446d-8723-bf3a97689ca5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:27:05 compute-0 nova_compute[192698]: 2025-10-01 14:27:05.695 2 DEBUG oslo_concurrency.lockutils [req-8363be67-8f7d-4727-be63-cd68d5fcfd42 req-234d955b-f39b-48c0-a44e-22fcbaa76aaa 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "ab7f70c7-9986-446d-8723-bf3a97689ca5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:27:05 compute-0 nova_compute[192698]: 2025-10-01 14:27:05.695 2 DEBUG nova.compute.manager [req-8363be67-8f7d-4727-be63-cd68d5fcfd42 req-234d955b-f39b-48c0-a44e-22fcbaa76aaa 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] No waiting events found dispatching network-vif-unplugged-4b5e6d9d-90b3-4892-a404-bdd62fd30b6e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:27:05 compute-0 nova_compute[192698]: 2025-10-01 14:27:05.695 2 DEBUG nova.compute.manager [req-8363be67-8f7d-4727-be63-cd68d5fcfd42 req-234d955b-f39b-48c0-a44e-22fcbaa76aaa 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Received event network-vif-unplugged-4b5e6d9d-90b3-4892-a404-bdd62fd30b6e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:27:06 compute-0 nova_compute[192698]: 2025-10-01 14:27:06.012 2 DEBUG nova.compute.manager [req-c0f4a63f-b79a-4c6c-9ea1-85c50ba6943f req-19c08c77-dc39-44d1-a51f-995e1f1fb0c7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Received event network-vif-deleted-4b5e6d9d-90b3-4892-a404-bdd62fd30b6e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:27:06 compute-0 nova_compute[192698]: 2025-10-01 14:27:06.014 2 INFO nova.compute.manager [req-c0f4a63f-b79a-4c6c-9ea1-85c50ba6943f req-19c08c77-dc39-44d1-a51f-995e1f1fb0c7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Neutron deleted interface 4b5e6d9d-90b3-4892-a404-bdd62fd30b6e; detaching it from the instance and deleting it from the info cache
Oct 01 14:27:06 compute-0 nova_compute[192698]: 2025-10-01 14:27:06.014 2 DEBUG nova.network.neutron [req-c0f4a63f-b79a-4c6c-9ea1-85c50ba6943f req-19c08c77-dc39-44d1-a51f-995e1f1fb0c7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:27:06 compute-0 nova_compute[192698]: 2025-10-01 14:27:06.438 2 DEBUG nova.network.neutron [-] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:27:06 compute-0 nova_compute[192698]: 2025-10-01 14:27:06.525 2 DEBUG nova.compute.manager [req-c0f4a63f-b79a-4c6c-9ea1-85c50ba6943f req-19c08c77-dc39-44d1-a51f-995e1f1fb0c7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Detach interface failed, port_id=4b5e6d9d-90b3-4892-a404-bdd62fd30b6e, reason: Instance ab7f70c7-9986-446d-8723-bf3a97689ca5 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:27:06 compute-0 nova_compute[192698]: 2025-10-01 14:27:06.947 2 INFO nova.compute.manager [-] [instance: ab7f70c7-9986-446d-8723-bf3a97689ca5] Took 2.53 seconds to deallocate network for instance.
Oct 01 14:27:07 compute-0 podman[225516]: 2025-10-01 14:27:07.170290901 +0000 UTC m=+0.078634366 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:27:07 compute-0 podman[225515]: 2025-10-01 14:27:07.181716669 +0000 UTC m=+0.088764439 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 01 14:27:07 compute-0 nova_compute[192698]: 2025-10-01 14:27:07.470 2 DEBUG oslo_concurrency.lockutils [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:27:07 compute-0 nova_compute[192698]: 2025-10-01 14:27:07.471 2 DEBUG oslo_concurrency.lockutils [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:27:07 compute-0 nova_compute[192698]: 2025-10-01 14:27:07.478 2 DEBUG oslo_concurrency.lockutils [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:27:07 compute-0 nova_compute[192698]: 2025-10-01 14:27:07.520 2 INFO nova.scheduler.client.report [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Deleted allocations for instance ab7f70c7-9986-446d-8723-bf3a97689ca5
Oct 01 14:27:08 compute-0 nova_compute[192698]: 2025-10-01 14:27:08.624 2 DEBUG oslo_concurrency.lockutils [None req-7bf29187-ef82-4108-87c3-09cafde904e0 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "ab7f70c7-9986-446d-8723-bf3a97689ca5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.152s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:27:08 compute-0 nova_compute[192698]: 2025-10-01 14:27:08.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:09 compute-0 nova_compute[192698]: 2025-10-01 14:27:09.110 2 DEBUG oslo_concurrency.lockutils [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "31943dad-f83c-4e41-8ca4-baac8a025255" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:27:09 compute-0 nova_compute[192698]: 2025-10-01 14:27:09.111 2 DEBUG oslo_concurrency.lockutils [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "31943dad-f83c-4e41-8ca4-baac8a025255" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:27:09 compute-0 nova_compute[192698]: 2025-10-01 14:27:09.111 2 DEBUG oslo_concurrency.lockutils [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "31943dad-f83c-4e41-8ca4-baac8a025255-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:27:09 compute-0 nova_compute[192698]: 2025-10-01 14:27:09.112 2 DEBUG oslo_concurrency.lockutils [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "31943dad-f83c-4e41-8ca4-baac8a025255-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:27:09 compute-0 nova_compute[192698]: 2025-10-01 14:27:09.112 2 DEBUG oslo_concurrency.lockutils [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "31943dad-f83c-4e41-8ca4-baac8a025255-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:27:09 compute-0 nova_compute[192698]: 2025-10-01 14:27:09.191 2 INFO nova.compute.manager [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Terminating instance
Oct 01 14:27:09 compute-0 nova_compute[192698]: 2025-10-01 14:27:09.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:09 compute-0 nova_compute[192698]: 2025-10-01 14:27:09.768 2 DEBUG nova.compute.manager [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:27:09 compute-0 kernel: tapec55d7c2-a6 (unregistering): left promiscuous mode
Oct 01 14:27:09 compute-0 NetworkManager[51741]: <info>  [1759328829.8758] device (tapec55d7c2-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:27:09 compute-0 ovn_controller[94909]: 2025-10-01T14:27:09Z|00206|binding|INFO|Releasing lport ec55d7c2-a601-4f1d-9c15-3aad37d2db0c from this chassis (sb_readonly=0)
Oct 01 14:27:09 compute-0 ovn_controller[94909]: 2025-10-01T14:27:09Z|00207|binding|INFO|Setting lport ec55d7c2-a601-4f1d-9c15-3aad37d2db0c down in Southbound
Oct 01 14:27:09 compute-0 nova_compute[192698]: 2025-10-01 14:27:09.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:09 compute-0 ovn_controller[94909]: 2025-10-01T14:27:09Z|00208|binding|INFO|Removing iface tapec55d7c2-a6 ovn-installed in OVS
Oct 01 14:27:09 compute-0 nova_compute[192698]: 2025-10-01 14:27:09.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:09 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:09.898 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:fc:97 10.100.0.12'], port_security=['fa:16:3e:e8:fc:97 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '31943dad-f83c-4e41-8ca4-baac8a025255', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '16', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=ec55d7c2-a601-4f1d-9c15-3aad37d2db0c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:27:09 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:09.900 103791 INFO neutron.agent.ovn.metadata.agent [-] Port ec55d7c2-a601-4f1d-9c15-3aad37d2db0c in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:27:09 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:09.901 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 031a8987-8430-4fb6-a464-01e4dca2fae7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:27:09 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:09.902 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c8b1c8af-6400-44c7-a6e3-0ba8bc2c6492]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:09 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:09.903 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 namespace which is not needed anymore
Oct 01 14:27:09 compute-0 nova_compute[192698]: 2025-10-01 14:27:09.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:09 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000018.scope: Deactivated successfully.
Oct 01 14:27:09 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000018.scope: Consumed 4.212s CPU time.
Oct 01 14:27:09 compute-0 systemd-machined[152704]: Machine qemu-18-instance-00000018 terminated.
Oct 01 14:27:10 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[225164]: [NOTICE]   (225168) : haproxy version is 3.0.5-8e879a5
Oct 01 14:27:10 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[225164]: [NOTICE]   (225168) : path to executable is /usr/sbin/haproxy
Oct 01 14:27:10 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[225164]: [WARNING]  (225168) : Exiting Master process...
Oct 01 14:27:10 compute-0 podman[225584]: 2025-10-01 14:27:10.040557889 +0000 UTC m=+0.038999850 container kill 08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 01 14:27:10 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[225164]: [ALERT]    (225168) : Current worker (225170) exited with code 143 (Terminated)
Oct 01 14:27:10 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[225164]: [WARNING]  (225168) : All workers exited. Exiting... (0)
Oct 01 14:27:10 compute-0 systemd[1]: libpod-08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5.scope: Deactivated successfully.
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.046 2 INFO nova.virt.libvirt.driver [-] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Instance destroyed successfully.
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.047 2 DEBUG nova.objects.instance [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lazy-loading 'resources' on Instance uuid 31943dad-f83c-4e41-8ca4-baac8a025255 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:27:10 compute-0 podman[225612]: 2025-10-01 14:27:10.168564043 +0000 UTC m=+0.109954400 container died 08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 01 14:27:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5-userdata-shm.mount: Deactivated successfully.
Oct 01 14:27:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-5b55685413169f0fafa3eb1c92b5043ca617fe603e6aef931944109ccd87d9a0-merged.mount: Deactivated successfully.
Oct 01 14:27:10 compute-0 podman[225612]: 2025-10-01 14:27:10.476919758 +0000 UTC m=+0.418310085 container cleanup 08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Oct 01 14:27:10 compute-0 systemd[1]: libpod-conmon-08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5.scope: Deactivated successfully.
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.560 2 DEBUG nova.virt.libvirt.vif [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-01T14:24:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-776027965',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-776027965',id=24,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:25:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-8whbp6r7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',clean_attempts='1',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:26:26Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=31943dad-f83c-4e41-8ca4-baac8a025255,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ec55d7c2-a601-4f1d-9c15-3aad37d2db0c", "address": "fa:16:3e:e8:fc:97", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec55d7c2-a6", "ovs_interfaceid": "ec55d7c2-a601-4f1d-9c15-3aad37d2db0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.561 2 DEBUG nova.network.os_vif_util [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "ec55d7c2-a601-4f1d-9c15-3aad37d2db0c", "address": "fa:16:3e:e8:fc:97", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapec55d7c2-a6", "ovs_interfaceid": "ec55d7c2-a601-4f1d-9c15-3aad37d2db0c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.562 2 DEBUG nova.network.os_vif_util [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e8:fc:97,bridge_name='br-int',has_traffic_filtering=True,id=ec55d7c2-a601-4f1d-9c15-3aad37d2db0c,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec55d7c2-a6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.563 2 DEBUG os_vif [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:fc:97,bridge_name='br-int',has_traffic_filtering=True,id=ec55d7c2-a601-4f1d-9c15-3aad37d2db0c,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec55d7c2-a6') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.566 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec55d7c2-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.572 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=012ac828-2d1c-43f6-a670-cadefd7763c0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.578 2 INFO os_vif [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:fc:97,bridge_name='br-int',has_traffic_filtering=True,id=ec55d7c2-a601-4f1d-9c15-3aad37d2db0c,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapec55d7c2-a6')
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.579 2 INFO nova.virt.libvirt.driver [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Deleting instance files /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255_del
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.579 2 INFO nova.virt.libvirt.driver [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Deletion of /var/lib/nova/instances/31943dad-f83c-4e41-8ca4-baac8a025255_del complete
Oct 01 14:27:10 compute-0 podman[225624]: 2025-10-01 14:27:10.593877355 +0000 UTC m=+0.414365369 container remove 08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 01 14:27:10 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:10.605 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f3fa03dc-85d8-4587-a18b-56d905166e45]: (4, ("Wed Oct  1 02:27:09 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 (08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5)\n08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5\nWed Oct  1 02:27:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 (08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5)\n08ef45f89e005a028e44ede7a2704b502cbd512faf534161220bb56c9eaebeb5\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:10 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:10.607 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[fc2898b7-853c-4275-b66d-372ea10cc080]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:10 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:10.608 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:27:10 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:10.609 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[386d2b5c-5c96-4bcc-b67d-14a5d4a57f7b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:10 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:10.610 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:10 compute-0 kernel: tap031a8987-80: left promiscuous mode
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.640 2 DEBUG nova.compute.manager [req-d0ca3d30-3b7e-4278-b76c-06e3fa1cb562 req-f50e813d-c3cc-4e1b-8616-d2193d9db6f0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Received event network-vif-unplugged-ec55d7c2-a601-4f1d-9c15-3aad37d2db0c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.641 2 DEBUG oslo_concurrency.lockutils [req-d0ca3d30-3b7e-4278-b76c-06e3fa1cb562 req-f50e813d-c3cc-4e1b-8616-d2193d9db6f0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "31943dad-f83c-4e41-8ca4-baac8a025255-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.641 2 DEBUG oslo_concurrency.lockutils [req-d0ca3d30-3b7e-4278-b76c-06e3fa1cb562 req-f50e813d-c3cc-4e1b-8616-d2193d9db6f0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "31943dad-f83c-4e41-8ca4-baac8a025255-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.642 2 DEBUG oslo_concurrency.lockutils [req-d0ca3d30-3b7e-4278-b76c-06e3fa1cb562 req-f50e813d-c3cc-4e1b-8616-d2193d9db6f0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "31943dad-f83c-4e41-8ca4-baac8a025255-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.642 2 DEBUG nova.compute.manager [req-d0ca3d30-3b7e-4278-b76c-06e3fa1cb562 req-f50e813d-c3cc-4e1b-8616-d2193d9db6f0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] No waiting events found dispatching network-vif-unplugged-ec55d7c2-a601-4f1d-9c15-3aad37d2db0c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.643 2 DEBUG nova.compute.manager [req-d0ca3d30-3b7e-4278-b76c-06e3fa1cb562 req-f50e813d-c3cc-4e1b-8616-d2193d9db6f0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Received event network-vif-unplugged-ec55d7c2-a601-4f1d-9c15-3aad37d2db0c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:27:10 compute-0 nova_compute[192698]: 2025-10-01 14:27:10.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:10 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:10.646 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[cab85fac-7aeb-494d-a1ac-30d04ccba602]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:10 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:10.678 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c511e1bf-7e6d-4191-aec3-6ae4aae2675c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:10 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:10.680 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e19fb4-725d-4c09-a057-53217ae51fcf]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:10 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:10.701 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f21da4e9-d950-4eb9-acb7-02171b957468]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513632, 'reachable_time': 17196, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225641, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:10 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:10.705 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:27:10 compute-0 systemd[1]: run-netns-ovnmeta\x2d031a8987\x2d8430\x2d4fb6\x2da464\x2d01e4dca2fae7.mount: Deactivated successfully.
Oct 01 14:27:10 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:10.706 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[6033f22d-440e-42fd-aeed-4a7b748ff6f0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:27:11 compute-0 nova_compute[192698]: 2025-10-01 14:27:11.109 2 INFO nova.compute.manager [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Took 1.34 seconds to destroy the instance on the hypervisor.
Oct 01 14:27:11 compute-0 nova_compute[192698]: 2025-10-01 14:27:11.110 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:27:11 compute-0 nova_compute[192698]: 2025-10-01 14:27:11.110 2 DEBUG nova.compute.manager [-] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:27:11 compute-0 nova_compute[192698]: 2025-10-01 14:27:11.110 2 DEBUG nova.network.neutron [-] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:27:11 compute-0 nova_compute[192698]: 2025-10-01 14:27:11.111 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:27:11 compute-0 nova_compute[192698]: 2025-10-01 14:27:11.457 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:27:11 compute-0 nova_compute[192698]: 2025-10-01 14:27:11.790 2 DEBUG nova.compute.manager [req-d9401d34-838f-4b92-b07a-881490578065 req-b45f1166-ae87-4652-bfda-4375e4a7dd0f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Received event network-vif-deleted-ec55d7c2-a601-4f1d-9c15-3aad37d2db0c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:27:11 compute-0 nova_compute[192698]: 2025-10-01 14:27:11.791 2 INFO nova.compute.manager [req-d9401d34-838f-4b92-b07a-881490578065 req-b45f1166-ae87-4652-bfda-4375e4a7dd0f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Neutron deleted interface ec55d7c2-a601-4f1d-9c15-3aad37d2db0c; detaching it from the instance and deleting it from the info cache
Oct 01 14:27:11 compute-0 nova_compute[192698]: 2025-10-01 14:27:11.791 2 DEBUG nova.network.neutron [req-d9401d34-838f-4b92-b07a-881490578065 req-b45f1166-ae87-4652-bfda-4375e4a7dd0f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:27:12 compute-0 nova_compute[192698]: 2025-10-01 14:27:12.229 2 DEBUG nova.network.neutron [-] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:27:12 compute-0 nova_compute[192698]: 2025-10-01 14:27:12.303 2 DEBUG nova.compute.manager [req-d9401d34-838f-4b92-b07a-881490578065 req-b45f1166-ae87-4652-bfda-4375e4a7dd0f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Detach interface failed, port_id=ec55d7c2-a601-4f1d-9c15-3aad37d2db0c, reason: Instance 31943dad-f83c-4e41-8ca4-baac8a025255 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:27:12 compute-0 nova_compute[192698]: 2025-10-01 14:27:12.726 2 DEBUG nova.compute.manager [req-3b2f077d-af48-455b-b2f4-ee95892be286 req-0ff7819c-15d1-4c64-b9ce-372e6ab1426f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Received event network-vif-unplugged-ec55d7c2-a601-4f1d-9c15-3aad37d2db0c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:27:12 compute-0 nova_compute[192698]: 2025-10-01 14:27:12.727 2 DEBUG oslo_concurrency.lockutils [req-3b2f077d-af48-455b-b2f4-ee95892be286 req-0ff7819c-15d1-4c64-b9ce-372e6ab1426f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "31943dad-f83c-4e41-8ca4-baac8a025255-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:27:12 compute-0 nova_compute[192698]: 2025-10-01 14:27:12.727 2 DEBUG oslo_concurrency.lockutils [req-3b2f077d-af48-455b-b2f4-ee95892be286 req-0ff7819c-15d1-4c64-b9ce-372e6ab1426f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "31943dad-f83c-4e41-8ca4-baac8a025255-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:27:12 compute-0 nova_compute[192698]: 2025-10-01 14:27:12.727 2 DEBUG oslo_concurrency.lockutils [req-3b2f077d-af48-455b-b2f4-ee95892be286 req-0ff7819c-15d1-4c64-b9ce-372e6ab1426f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "31943dad-f83c-4e41-8ca4-baac8a025255-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:27:12 compute-0 nova_compute[192698]: 2025-10-01 14:27:12.727 2 DEBUG nova.compute.manager [req-3b2f077d-af48-455b-b2f4-ee95892be286 req-0ff7819c-15d1-4c64-b9ce-372e6ab1426f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] No waiting events found dispatching network-vif-unplugged-ec55d7c2-a601-4f1d-9c15-3aad37d2db0c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:27:12 compute-0 nova_compute[192698]: 2025-10-01 14:27:12.728 2 DEBUG nova.compute.manager [req-3b2f077d-af48-455b-b2f4-ee95892be286 req-0ff7819c-15d1-4c64-b9ce-372e6ab1426f 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Received event network-vif-unplugged-ec55d7c2-a601-4f1d-9c15-3aad37d2db0c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:27:12 compute-0 nova_compute[192698]: 2025-10-01 14:27:12.739 2 INFO nova.compute.manager [-] [instance: 31943dad-f83c-4e41-8ca4-baac8a025255] Took 1.63 seconds to deallocate network for instance.
Oct 01 14:27:13 compute-0 podman[225642]: 2025-10-01 14:27:13.18454146 +0000 UTC m=+0.100667380 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:27:13 compute-0 nova_compute[192698]: 2025-10-01 14:27:13.269 2 DEBUG oslo_concurrency.lockutils [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:27:13 compute-0 nova_compute[192698]: 2025-10-01 14:27:13.270 2 DEBUG oslo_concurrency.lockutils [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:27:13 compute-0 nova_compute[192698]: 2025-10-01 14:27:13.322 2 DEBUG nova.compute.provider_tree [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:27:13 compute-0 nova_compute[192698]: 2025-10-01 14:27:13.830 2 DEBUG nova.scheduler.client.report [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:27:13 compute-0 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct 01 14:27:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:14.288 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:27:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:14.289 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:27:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:14.289 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:27:14 compute-0 nova_compute[192698]: 2025-10-01 14:27:14.341 2 DEBUG oslo_concurrency.lockutils [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.071s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:27:14 compute-0 nova_compute[192698]: 2025-10-01 14:27:14.366 2 INFO nova.scheduler.client.report [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Deleted allocations for instance 31943dad-f83c-4e41-8ca4-baac8a025255
Oct 01 14:27:14 compute-0 nova_compute[192698]: 2025-10-01 14:27:14.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:15 compute-0 nova_compute[192698]: 2025-10-01 14:27:15.400 2 DEBUG oslo_concurrency.lockutils [None req-5d21a2d3-786f-4052-a1f7-9c24591302ed f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "31943dad-f83c-4e41-8ca4-baac8a025255" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.289s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:27:15 compute-0 nova_compute[192698]: 2025-10-01 14:27:15.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:19 compute-0 nova_compute[192698]: 2025-10-01 14:27:19.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:20 compute-0 nova_compute[192698]: 2025-10-01 14:27:20.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:20 compute-0 unix_chkpwd[225668]: password check failed for user (root)
Oct 01 14:27:20 compute-0 sshd-session[225513]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:27:22 compute-0 sshd-session[225513]: Failed password for root from 101.47.181.100 port 52000 ssh2
Oct 01 14:27:23 compute-0 sshd-session[225513]: Connection closed by authenticating user root 101.47.181.100 port 52000 [preauth]
Oct 01 14:27:24 compute-0 podman[225669]: 2025-10-01 14:27:24.1619284 +0000 UTC m=+0.066313875 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Oct 01 14:27:24 compute-0 podman[225670]: 2025-10-01 14:27:24.214739041 +0000 UTC m=+0.123739410 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:27:24 compute-0 nova_compute[192698]: 2025-10-01 14:27:24.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:25 compute-0 nova_compute[192698]: 2025-10-01 14:27:25.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:25 compute-0 nova_compute[192698]: 2025-10-01 14:27:25.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:27:27 compute-0 nova_compute[192698]: 2025-10-01 14:27:27.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:27:27 compute-0 nova_compute[192698]: 2025-10-01 14:27:27.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:27:28 compute-0 nova_compute[192698]: 2025-10-01 14:27:28.466 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:27:28 compute-0 nova_compute[192698]: 2025-10-01 14:27:28.467 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:27:28 compute-0 nova_compute[192698]: 2025-10-01 14:27:28.467 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:27:28 compute-0 nova_compute[192698]: 2025-10-01 14:27:28.467 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:27:28 compute-0 nova_compute[192698]: 2025-10-01 14:27:28.638 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:27:28 compute-0 nova_compute[192698]: 2025-10-01 14:27:28.641 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:27:28 compute-0 nova_compute[192698]: 2025-10-01 14:27:28.665 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:27:28 compute-0 nova_compute[192698]: 2025-10-01 14:27:28.665 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5837MB free_disk=73.30263900756836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:27:28 compute-0 nova_compute[192698]: 2025-10-01 14:27:28.666 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:27:28 compute-0 nova_compute[192698]: 2025-10-01 14:27:28.666 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:27:29 compute-0 unix_chkpwd[225719]: password check failed for user (root)
Oct 01 14:27:29 compute-0 sshd-session[225715]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:27:29 compute-0 nova_compute[192698]: 2025-10-01 14:27:29.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:29 compute-0 podman[203144]: time="2025-10-01T14:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:27:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:27:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3029 "" "Go-http-client/1.1"
Oct 01 14:27:29 compute-0 nova_compute[192698]: 2025-10-01 14:27:29.969 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:27:29 compute-0 nova_compute[192698]: 2025-10-01 14:27:29.970 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:27:28 up  1:26,  0 user,  load average: 0.15, 0.24, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:27:29 compute-0 nova_compute[192698]: 2025-10-01 14:27:29.992 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:27:30 compute-0 nova_compute[192698]: 2025-10-01 14:27:30.504 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:27:30 compute-0 nova_compute[192698]: 2025-10-01 14:27:30.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:31 compute-0 nova_compute[192698]: 2025-10-01 14:27:31.019 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:27:31 compute-0 nova_compute[192698]: 2025-10-01 14:27:31.020 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.354s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:27:31 compute-0 podman[225720]: 2025-10-01 14:27:31.208373618 +0000 UTC m=+0.113455394 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, vcs-type=git, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct 01 14:27:31 compute-0 sshd-session[225715]: Failed password for root from 101.47.181.100 port 48274 ssh2
Oct 01 14:27:31 compute-0 openstack_network_exporter[205307]: ERROR   14:27:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:27:31 compute-0 openstack_network_exporter[205307]: ERROR   14:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:27:31 compute-0 openstack_network_exporter[205307]: ERROR   14:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:27:31 compute-0 openstack_network_exporter[205307]: ERROR   14:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:27:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:27:31 compute-0 openstack_network_exporter[205307]: ERROR   14:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:27:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:27:32 compute-0 sshd-session[225715]: Connection closed by authenticating user root 101.47.181.100 port 48274 [preauth]
Oct 01 14:27:33 compute-0 nova_compute[192698]: 2025-10-01 14:27:33.020 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:27:33 compute-0 nova_compute[192698]: 2025-10-01 14:27:33.544 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:27:33 compute-0 nova_compute[192698]: 2025-10-01 14:27:33.545 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:27:34 compute-0 nova_compute[192698]: 2025-10-01 14:27:34.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:34 compute-0 nova_compute[192698]: 2025-10-01 14:27:34.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:27:34 compute-0 nova_compute[192698]: 2025-10-01 14:27:34.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:27:34 compute-0 nova_compute[192698]: 2025-10-01 14:27:34.927 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:27:35 compute-0 nova_compute[192698]: 2025-10-01 14:27:35.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:35 compute-0 nova_compute[192698]: 2025-10-01 14:27:35.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:27:38 compute-0 podman[225744]: 2025-10-01 14:27:38.16209959 +0000 UTC m=+0.077409283 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 01 14:27:38 compute-0 podman[225745]: 2025-10-01 14:27:38.16913974 +0000 UTC m=+0.074083174 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 01 14:27:39 compute-0 nova_compute[192698]: 2025-10-01 14:27:39.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:40 compute-0 nova_compute[192698]: 2025-10-01 14:27:40.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:44 compute-0 podman[225784]: 2025-10-01 14:27:44.149959971 +0000 UTC m=+0.065622266 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:27:44 compute-0 nova_compute[192698]: 2025-10-01 14:27:44.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:45 compute-0 nova_compute[192698]: 2025-10-01 14:27:45.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:46 compute-0 unix_chkpwd[225808]: password check failed for user (root)
Oct 01 14:27:46 compute-0 sshd-session[225742]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:27:47 compute-0 sshd-session[225742]: Failed password for root from 101.47.181.100 port 37086 ssh2
Oct 01 14:27:49 compute-0 sshd-session[225742]: Connection closed by authenticating user root 101.47.181.100 port 37086 [preauth]
Oct 01 14:27:49 compute-0 nova_compute[192698]: 2025-10-01 14:27:49.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:50 compute-0 nova_compute[192698]: 2025-10-01 14:27:50.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:53 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:53.898 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:27:53 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:53.900 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:27:53 compute-0 nova_compute[192698]: 2025-10-01 14:27:53.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:54 compute-0 nova_compute[192698]: 2025-10-01 14:27:54.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:55 compute-0 podman[225811]: 2025-10-01 14:27:55.167646484 +0000 UTC m=+0.076819978 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 01 14:27:55 compute-0 podman[225812]: 2025-10-01 14:27:55.231229515 +0000 UTC m=+0.138647721 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 14:27:55 compute-0 nova_compute[192698]: 2025-10-01 14:27:55.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:27:56.902 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:27:59 compute-0 nova_compute[192698]: 2025-10-01 14:27:59.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:27:59 compute-0 podman[203144]: time="2025-10-01T14:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:27:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:27:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3027 "" "Go-http-client/1.1"
Oct 01 14:28:00 compute-0 nova_compute[192698]: 2025-10-01 14:28:00.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:01 compute-0 openstack_network_exporter[205307]: ERROR   14:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:28:01 compute-0 openstack_network_exporter[205307]: ERROR   14:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:28:01 compute-0 openstack_network_exporter[205307]: ERROR   14:28:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:28:01 compute-0 openstack_network_exporter[205307]: ERROR   14:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:28:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:28:01 compute-0 openstack_network_exporter[205307]: ERROR   14:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:28:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:28:02 compute-0 podman[225858]: 2025-10-01 14:28:02.161407984 +0000 UTC m=+0.076040466 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, distribution-scope=public, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Oct 01 14:28:03 compute-0 unix_chkpwd[225880]: password check failed for user (root)
Oct 01 14:28:03 compute-0 sshd-session[225809]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:28:04 compute-0 nova_compute[192698]: 2025-10-01 14:28:04.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:04 compute-0 sshd-session[225809]: Failed password for root from 101.47.181.100 port 33486 ssh2
Oct 01 14:28:05 compute-0 nova_compute[192698]: 2025-10-01 14:28:05.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:06 compute-0 sshd-session[225809]: Connection closed by authenticating user root 101.47.181.100 port 33486 [preauth]
Oct 01 14:28:09 compute-0 podman[225883]: 2025-10-01 14:28:09.152333888 +0000 UTC m=+0.073674873 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 01 14:28:09 compute-0 podman[225882]: 2025-10-01 14:28:09.164132375 +0000 UTC m=+0.079237162 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20250930)
Oct 01 14:28:09 compute-0 nova_compute[192698]: 2025-10-01 14:28:09.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:10 compute-0 nova_compute[192698]: 2025-10-01 14:28:10.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:14.290 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:28:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:14.291 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:28:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:14.291 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:28:14 compute-0 nova_compute[192698]: 2025-10-01 14:28:14.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:15 compute-0 podman[225924]: 2025-10-01 14:28:15.172420354 +0000 UTC m=+0.082097340 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:28:15 compute-0 unix_chkpwd[225948]: password check failed for user (root)
Oct 01 14:28:15 compute-0 sshd-session[225881]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:28:15 compute-0 nova_compute[192698]: 2025-10-01 14:28:15.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:17 compute-0 sshd-session[225881]: Failed password for root from 101.47.181.100 port 60708 ssh2
Oct 01 14:28:19 compute-0 nova_compute[192698]: 2025-10-01 14:28:19.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:20 compute-0 nova_compute[192698]: 2025-10-01 14:28:20.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:22 compute-0 sshd-session[225881]: Connection closed by authenticating user root 101.47.181.100 port 60708 [preauth]
Oct 01 14:28:23 compute-0 unix_chkpwd[225951]: password check failed for user (root)
Oct 01 14:28:23 compute-0 sshd-session[225949]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:28:24 compute-0 nova_compute[192698]: 2025-10-01 14:28:24.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:25 compute-0 nova_compute[192698]: 2025-10-01 14:28:25.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:25 compute-0 nova_compute[192698]: 2025-10-01 14:28:25.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:28:26 compute-0 sshd-session[225949]: Failed password for root from 101.47.181.100 port 54512 ssh2
Oct 01 14:28:26 compute-0 podman[225952]: 2025-10-01 14:28:26.189192374 +0000 UTC m=+0.093190058 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 01 14:28:26 compute-0 podman[225953]: 2025-10-01 14:28:26.213764405 +0000 UTC m=+0.115189070 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 01 14:28:27 compute-0 sshd-session[225949]: Connection closed by authenticating user root 101.47.181.100 port 54512 [preauth]
Oct 01 14:28:27 compute-0 nova_compute[192698]: 2025-10-01 14:28:27.927 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:28:27 compute-0 nova_compute[192698]: 2025-10-01 14:28:27.927 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 01 14:28:29 compute-0 nova_compute[192698]: 2025-10-01 14:28:29.432 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:28:29 compute-0 nova_compute[192698]: 2025-10-01 14:28:29.433 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:28:29 compute-0 nova_compute[192698]: 2025-10-01 14:28:29.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:29 compute-0 podman[203144]: time="2025-10-01T14:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:28:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:28:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3029 "" "Go-http-client/1.1"
Oct 01 14:28:29 compute-0 nova_compute[192698]: 2025-10-01 14:28:29.951 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:28:29 compute-0 nova_compute[192698]: 2025-10-01 14:28:29.952 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:28:29 compute-0 nova_compute[192698]: 2025-10-01 14:28:29.952 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:28:29 compute-0 nova_compute[192698]: 2025-10-01 14:28:29.952 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:28:30 compute-0 nova_compute[192698]: 2025-10-01 14:28:30.202 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:28:30 compute-0 nova_compute[192698]: 2025-10-01 14:28:30.205 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:28:30 compute-0 nova_compute[192698]: 2025-10-01 14:28:30.238 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:28:30 compute-0 nova_compute[192698]: 2025-10-01 14:28:30.239 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5853MB free_disk=73.3026237487793GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:28:30 compute-0 nova_compute[192698]: 2025-10-01 14:28:30.240 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:28:30 compute-0 nova_compute[192698]: 2025-10-01 14:28:30.240 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:28:30 compute-0 nova_compute[192698]: 2025-10-01 14:28:30.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:31 compute-0 openstack_network_exporter[205307]: ERROR   14:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:28:31 compute-0 openstack_network_exporter[205307]: ERROR   14:28:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:28:31 compute-0 openstack_network_exporter[205307]: ERROR   14:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:28:31 compute-0 openstack_network_exporter[205307]: ERROR   14:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:28:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:28:31 compute-0 openstack_network_exporter[205307]: ERROR   14:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:28:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:28:31 compute-0 nova_compute[192698]: 2025-10-01 14:28:31.871 2 WARNING nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 247a480d-9e4d-4832-9646-d263b8a3035e has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.378 2 WARNING nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 8668fad5-f310-4dec-9960-6e26c28db100 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.380 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.380 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:28:30 up  1:27,  0 user,  load average: 0.09, 0.21, 0.28\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.413 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing inventories for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.430 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating ProviderTree inventory for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.431 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.456 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing aggregate associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.483 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing trait associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, traits: COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.542 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:28:32 compute-0 ovn_controller[94909]: 2025-10-01T14:28:32Z|00209|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.790 2 DEBUG nova.virt.libvirt.driver [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Creating tmpfile /var/lib/nova/instances/tmpubdhy59o to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.791 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.792 2 DEBUG nova.virt.libvirt.driver [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Creating tmpfile /var/lib/nova/instances/tmpyj7glsxk to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.793 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.796 2 DEBUG nova.compute.manager [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpubdhy59o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 01 14:28:32 compute-0 nova_compute[192698]: 2025-10-01 14:28:32.799 2 DEBUG nova.compute.manager [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyj7glsxk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 01 14:28:33 compute-0 nova_compute[192698]: 2025-10-01 14:28:33.051 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:28:33 compute-0 podman[226000]: 2025-10-01 14:28:33.196024066 +0000 UTC m=+0.096686132 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Oct 01 14:28:33 compute-0 nova_compute[192698]: 2025-10-01 14:28:33.561 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:28:33 compute-0 nova_compute[192698]: 2025-10-01 14:28:33.562 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.322s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:28:33 compute-0 nova_compute[192698]: 2025-10-01 14:28:33.562 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:28:34 compute-0 nova_compute[192698]: 2025-10-01 14:28:34.561 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:28:34 compute-0 nova_compute[192698]: 2025-10-01 14:28:34.562 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:28:34 compute-0 nova_compute[192698]: 2025-10-01 14:28:34.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:34 compute-0 nova_compute[192698]: 2025-10-01 14:28:34.820 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:28:34 compute-0 nova_compute[192698]: 2025-10-01 14:28:34.835 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:28:35 compute-0 unix_chkpwd[226021]: password check failed for user (root)
Oct 01 14:28:35 compute-0 sshd-session[225997]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:28:35 compute-0 nova_compute[192698]: 2025-10-01 14:28:35.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:36 compute-0 nova_compute[192698]: 2025-10-01 14:28:36.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:28:36 compute-0 nova_compute[192698]: 2025-10-01 14:28:36.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:28:36 compute-0 nova_compute[192698]: 2025-10-01 14:28:36.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:28:36 compute-0 nova_compute[192698]: 2025-10-01 14:28:36.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:28:37 compute-0 sshd-session[225997]: Failed password for root from 101.47.181.100 port 48078 ssh2
Oct 01 14:28:38 compute-0 sshd-session[225997]: Connection closed by authenticating user root 101.47.181.100 port 48078 [preauth]
Oct 01 14:28:39 compute-0 nova_compute[192698]: 2025-10-01 14:28:39.590 2 DEBUG nova.compute.manager [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpubdhy59o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8668fad5-f310-4dec-9960-6e26c28db100',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 01 14:28:39 compute-0 nova_compute[192698]: 2025-10-01 14:28:39.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:40 compute-0 podman[226024]: 2025-10-01 14:28:40.151548188 +0000 UTC m=+0.064623740 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:28:40 compute-0 podman[226025]: 2025-10-01 14:28:40.161811574 +0000 UTC m=+0.068455173 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 01 14:28:40 compute-0 unix_chkpwd[226065]: password check failed for user (root)
Oct 01 14:28:40 compute-0 sshd-session[226022]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:28:40 compute-0 nova_compute[192698]: 2025-10-01 14:28:40.607 2 DEBUG oslo_concurrency.lockutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-8668fad5-f310-4dec-9960-6e26c28db100" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:28:40 compute-0 nova_compute[192698]: 2025-10-01 14:28:40.608 2 DEBUG oslo_concurrency.lockutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-8668fad5-f310-4dec-9960-6e26c28db100" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:28:40 compute-0 nova_compute[192698]: 2025-10-01 14:28:40.608 2 DEBUG nova.network.neutron [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:28:40 compute-0 nova_compute[192698]: 2025-10-01 14:28:40.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:41 compute-0 nova_compute[192698]: 2025-10-01 14:28:41.116 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:28:41 compute-0 nova_compute[192698]: 2025-10-01 14:28:41.846 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:28:41 compute-0 nova_compute[192698]: 2025-10-01 14:28:41.991 2 DEBUG nova.network.neutron [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Updating instance_info_cache with network_info: [{"id": "e6b4fe04-03d3-4ff9-b41e-73373dfb2f25", "address": "fa:16:3e:4a:89:04", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6b4fe04-03", "ovs_interfaceid": "e6b4fe04-03d3-4ff9-b41e-73373dfb2f25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:28:42 compute-0 sshd-session[226022]: Failed password for root from 101.47.181.100 port 43676 ssh2
Oct 01 14:28:42 compute-0 nova_compute[192698]: 2025-10-01 14:28:42.500 2 DEBUG oslo_concurrency.lockutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-8668fad5-f310-4dec-9960-6e26c28db100" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:28:42 compute-0 nova_compute[192698]: 2025-10-01 14:28:42.519 2 DEBUG nova.virt.libvirt.driver [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpubdhy59o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8668fad5-f310-4dec-9960-6e26c28db100',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 01 14:28:42 compute-0 nova_compute[192698]: 2025-10-01 14:28:42.519 2 DEBUG nova.virt.libvirt.driver [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Creating instance directory: /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 01 14:28:42 compute-0 nova_compute[192698]: 2025-10-01 14:28:42.520 2 DEBUG nova.virt.libvirt.driver [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Creating disk.info with the contents: {'/var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk': 'qcow2', '/var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 01 14:28:42 compute-0 nova_compute[192698]: 2025-10-01 14:28:42.520 2 DEBUG nova.virt.libvirt.driver [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 01 14:28:42 compute-0 nova_compute[192698]: 2025-10-01 14:28:42.521 2 DEBUG nova.objects.instance [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8668fad5-f310-4dec-9960-6e26c28db100 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.027 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.031 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.033 2 DEBUG oslo_concurrency.processutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.089 2 DEBUG oslo_concurrency.processutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.090 2 DEBUG oslo_concurrency.lockutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.091 2 DEBUG oslo_concurrency.lockutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.092 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.096 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.097 2 DEBUG oslo_concurrency.processutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.158 2 DEBUG oslo_concurrency.processutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.159 2 DEBUG oslo_concurrency.processutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.208 2 DEBUG oslo_concurrency.processutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.209 2 DEBUG oslo_concurrency.lockutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.209 2 DEBUG oslo_concurrency.processutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.297 2 DEBUG oslo_concurrency.processutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.298 2 DEBUG nova.virt.disk.api [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.299 2 DEBUG oslo_concurrency.processutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.354 2 DEBUG oslo_concurrency.processutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.355 2 DEBUG nova.virt.disk.api [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.356 2 DEBUG nova.objects.instance [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 8668fad5-f310-4dec-9960-6e26c28db100 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:28:43 compute-0 sshd-session[226022]: Connection closed by authenticating user root 101.47.181.100 port 43676 [preauth]
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.863 2 DEBUG nova.objects.base [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<8668fad5-f310-4dec-9960-6e26c28db100> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.864 2 DEBUG oslo_concurrency.processutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.908 2 DEBUG oslo_concurrency.processutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk.config 497664" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.909 2 DEBUG nova.virt.libvirt.driver [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.912 2 DEBUG nova.virt.libvirt.vif [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-01T14:27:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1324074072',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1324074072',id=27,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:28:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-th77tlj1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:28:04Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=8668fad5-f310-4dec-9960-6e26c28db100,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e6b4fe04-03d3-4ff9-b41e-73373dfb2f25", "address": "fa:16:3e:4a:89:04", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape6b4fe04-03", "ovs_interfaceid": "e6b4fe04-03d3-4ff9-b41e-73373dfb2f25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.912 2 DEBUG nova.network.os_vif_util [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "e6b4fe04-03d3-4ff9-b41e-73373dfb2f25", "address": "fa:16:3e:4a:89:04", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape6b4fe04-03", "ovs_interfaceid": "e6b4fe04-03d3-4ff9-b41e-73373dfb2f25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.914 2 DEBUG nova.network.os_vif_util [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:89:04,bridge_name='br-int',has_traffic_filtering=True,id=e6b4fe04-03d3-4ff9-b41e-73373dfb2f25,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6b4fe04-03') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.915 2 DEBUG os_vif [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:89:04,bridge_name='br-int',has_traffic_filtering=True,id=e6b4fe04-03d3-4ff9-b41e-73373dfb2f25,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6b4fe04-03') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.917 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.918 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.920 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '78803887-2ef6-512d-bf0d-37e59a68b786', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.930 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape6b4fe04-03, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.931 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tape6b4fe04-03, col_values=(('qos', UUID('0c3b76d3-6571-4116-9585-4ced4ccd6eb1')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.931 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tape6b4fe04-03, col_values=(('external_ids', {'iface-id': 'e6b4fe04-03d3-4ff9-b41e-73373dfb2f25', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:89:04', 'vm-uuid': '8668fad5-f310-4dec-9960-6e26c28db100'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:43 compute-0 NetworkManager[51741]: <info>  [1759328923.9348] manager: (tape6b4fe04-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.945 2 INFO os_vif [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:89:04,bridge_name='br-int',has_traffic_filtering=True,id=e6b4fe04-03d3-4ff9-b41e-73373dfb2f25,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6b4fe04-03')
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.946 2 DEBUG nova.virt.libvirt.driver [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.947 2 DEBUG nova.compute.manager [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpubdhy59o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8668fad5-f310-4dec-9960-6e26c28db100',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 01 14:28:43 compute-0 nova_compute[192698]: 2025-10-01 14:28:43.948 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:28:44 compute-0 nova_compute[192698]: 2025-10-01 14:28:44.536 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:28:44 compute-0 nova_compute[192698]: 2025-10-01 14:28:44.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:45 compute-0 unix_chkpwd[226089]: password check failed for user (root)
Oct 01 14:28:45 compute-0 sshd-session[226082]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:28:45 compute-0 nova_compute[192698]: 2025-10-01 14:28:45.446 2 DEBUG nova.network.neutron [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Port e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 01 14:28:45 compute-0 nova_compute[192698]: 2025-10-01 14:28:45.461 2 DEBUG nova.compute.manager [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpubdhy59o',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8668fad5-f310-4dec-9960-6e26c28db100',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 01 14:28:46 compute-0 podman[226090]: 2025-10-01 14:28:46.173471742 +0000 UTC m=+0.082563042 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:28:46 compute-0 sshd-session[226082]: Failed password for root from 101.47.181.100 port 43684 ssh2
Oct 01 14:28:48 compute-0 sshd-session[226082]: Connection closed by authenticating user root 101.47.181.100 port 43684 [preauth]
Oct 01 14:28:48 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 01 14:28:48 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 01 14:28:48 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 01 14:28:48 compute-0 kernel: tape6b4fe04-03: entered promiscuous mode
Oct 01 14:28:48 compute-0 NetworkManager[51741]: <info>  [1759328928.7916] manager: (tape6b4fe04-03): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Oct 01 14:28:48 compute-0 ovn_controller[94909]: 2025-10-01T14:28:48Z|00210|binding|INFO|Claiming lport e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 for this additional chassis.
Oct 01 14:28:48 compute-0 ovn_controller[94909]: 2025-10-01T14:28:48Z|00211|binding|INFO|e6b4fe04-03d3-4ff9-b41e-73373dfb2f25: Claiming fa:16:3e:4a:89:04 10.100.0.7
Oct 01 14:28:48 compute-0 nova_compute[192698]: 2025-10-01 14:28:48.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:48.817 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:89:04 10.100.0.7'], port_security=['fa:16:3e:4a:89:04 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '8668fad5-f310-4dec-9960-6e26c28db100', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '10', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=e6b4fe04-03d3-4ff9-b41e-73373dfb2f25) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:28:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:48.818 103791 INFO neutron.agent.ovn.metadata.agent [-] Port e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:28:48 compute-0 ovn_controller[94909]: 2025-10-01T14:28:48Z|00212|binding|INFO|Setting lport e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 ovn-installed in OVS
Oct 01 14:28:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:48.820 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:28:48 compute-0 systemd-udevd[226146]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:28:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:48.847 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[15049127-73c5-45b2-a9a3-bb2bd080cb17]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:48.848 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap031a8987-81 in ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:28:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:48.862 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap031a8987-80 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:28:48 compute-0 nova_compute[192698]: 2025-10-01 14:28:48.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:48 compute-0 NetworkManager[51741]: <info>  [1759328928.8644] device (tape6b4fe04-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:28:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:48.863 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[bd9e233b-b4aa-4a83-9414-7bbfe5ec55e8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:48 compute-0 NetworkManager[51741]: <info>  [1759328928.8659] device (tape6b4fe04-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:28:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:48.865 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[60cd1891-abf0-474e-b62e-4ea04a5e8cd7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:48.882 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[1add322f-e8aa-4c55-92f8-963f70f70d69]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:48 compute-0 systemd-machined[152704]: New machine qemu-20-instance-0000001b.
Oct 01 14:28:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:48.899 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4687ba45-5af3-4a0f-9017-7d04dd0a140f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:48 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000001b.
Oct 01 14:28:48 compute-0 nova_compute[192698]: 2025-10-01 14:28:48.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:48.953 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[9537a7af-98ef-4baa-92fa-f2f0d10dd280]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:48.960 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[bb70ba3b-f538-4598-9274-b1f69ac29280]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:48 compute-0 NetworkManager[51741]: <info>  [1759328928.9618] manager: (tap031a8987-80): new Veth device (/org/freedesktop/NetworkManager/Devices/82)
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.014 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[f9ec63ee-09a3-4fe2-9900-230edc048401]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.019 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[c90acc7a-e911-4cd6-a326-e8f8f7519384]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:49 compute-0 NetworkManager[51741]: <info>  [1759328929.0528] device (tap031a8987-80): carrier: link connected
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.060 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a490457e-ead7-4997-9200-a20c70d05168]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.086 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[28e226df-b455-4cde-82cc-e2ea2e57367b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 528958, 'reachable_time': 33985, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226180, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.111 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[fecc4614-adee-47fc-8cc4-788c4e6e6a3d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:6c81'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 528958, 'tstamp': 528958}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226181, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.144 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[42040009-8bfa-4618-b270-9881ae2173d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 528958, 'reachable_time': 33985, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226182, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.192 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7c5cd61d-525a-4679-9cea-2a42a418e13c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.284 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1611a51b-d03c-4691-97e4-216281d33a97]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.286 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.286 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.286 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:28:49 compute-0 nova_compute[192698]: 2025-10-01 14:28:49.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:49 compute-0 NetworkManager[51741]: <info>  [1759328929.2900] manager: (tap031a8987-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Oct 01 14:28:49 compute-0 kernel: tap031a8987-80: entered promiscuous mode
Oct 01 14:28:49 compute-0 nova_compute[192698]: 2025-10-01 14:28:49.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.292 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:28:49 compute-0 nova_compute[192698]: 2025-10-01 14:28:49.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:49 compute-0 ovn_controller[94909]: 2025-10-01T14:28:49Z|00213|binding|INFO|Releasing lport 6dd814dc-cba2-4392-85ef-eadb8c4615f7 from this chassis (sb_readonly=0)
Oct 01 14:28:49 compute-0 nova_compute[192698]: 2025-10-01 14:28:49.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.317 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[761d09d1-6938-4ee6-b867-3cbbb07b5b23]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.318 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.318 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.318 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 031a8987-8430-4fb6-a464-01e4dca2fae7 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.318 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.319 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[01030630-0cdb-4f34-82a1-02fe8327d520]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.319 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.320 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[413f8c4f-0d49-4eab-b33f-e27ec588f853]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.320 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:28:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:28:49.321 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'env', 'PROCESS_TAG=haproxy-031a8987-8430-4fb6-a464-01e4dca2fae7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/031a8987-8430-4fb6-a464-01e4dca2fae7.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:28:49 compute-0 nova_compute[192698]: 2025-10-01 14:28:49.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:49 compute-0 podman[226223]: 2025-10-01 14:28:49.780166881 +0000 UTC m=+0.074684930 container create e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Oct 01 14:28:49 compute-0 systemd[1]: Started libpod-conmon-e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7.scope.
Oct 01 14:28:49 compute-0 podman[226223]: 2025-10-01 14:28:49.740654728 +0000 UTC m=+0.035172867 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:28:49 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:28:49 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0cc31d06f1f31d31eb9057c73e160bf957b728979155ad525ce92c047b7c24b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:28:49 compute-0 podman[226223]: 2025-10-01 14:28:49.875668121 +0000 UTC m=+0.170186230 container init e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:28:49 compute-0 podman[226223]: 2025-10-01 14:28:49.885932517 +0000 UTC m=+0.180450586 container start e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:28:49 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[226238]: [NOTICE]   (226242) : New worker (226244) forked
Oct 01 14:28:49 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[226238]: [NOTICE]   (226242) : Loading success.
Oct 01 14:28:50 compute-0 nova_compute[192698]: 2025-10-01 14:28:50.927 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:28:50 compute-0 nova_compute[192698]: 2025-10-01 14:28:50.927 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 01 14:28:51 compute-0 nova_compute[192698]: 2025-10-01 14:28:51.435 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 01 14:28:52 compute-0 ovn_controller[94909]: 2025-10-01T14:28:52Z|00214|binding|INFO|Claiming lport e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 for this chassis.
Oct 01 14:28:52 compute-0 ovn_controller[94909]: 2025-10-01T14:28:52Z|00215|binding|INFO|e6b4fe04-03d3-4ff9-b41e-73373dfb2f25: Claiming fa:16:3e:4a:89:04 10.100.0.7
Oct 01 14:28:52 compute-0 ovn_controller[94909]: 2025-10-01T14:28:52Z|00216|binding|INFO|Setting lport e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 up in Southbound
Oct 01 14:28:52 compute-0 unix_chkpwd[226268]: password check failed for user (root)
Oct 01 14:28:52 compute-0 sshd-session[226200]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:28:53 compute-0 nova_compute[192698]: 2025-10-01 14:28:53.490 2 INFO nova.compute.manager [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Post operation of migration started
Oct 01 14:28:53 compute-0 nova_compute[192698]: 2025-10-01 14:28:53.491 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:28:53 compute-0 nova_compute[192698]: 2025-10-01 14:28:53.632 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:28:53 compute-0 nova_compute[192698]: 2025-10-01 14:28:53.633 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:28:53 compute-0 nova_compute[192698]: 2025-10-01 14:28:53.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:54 compute-0 sshd-session[226200]: Failed password for root from 101.47.181.100 port 54450 ssh2
Oct 01 14:28:54 compute-0 nova_compute[192698]: 2025-10-01 14:28:54.509 2 DEBUG oslo_concurrency.lockutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-8668fad5-f310-4dec-9960-6e26c28db100" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:28:54 compute-0 nova_compute[192698]: 2025-10-01 14:28:54.510 2 DEBUG oslo_concurrency.lockutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-8668fad5-f310-4dec-9960-6e26c28db100" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:28:54 compute-0 nova_compute[192698]: 2025-10-01 14:28:54.511 2 DEBUG nova.network.neutron [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:28:54 compute-0 nova_compute[192698]: 2025-10-01 14:28:54.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:55 compute-0 nova_compute[192698]: 2025-10-01 14:28:55.039 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:28:55 compute-0 nova_compute[192698]: 2025-10-01 14:28:55.665 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:28:55 compute-0 nova_compute[192698]: 2025-10-01 14:28:55.912 2 DEBUG nova.network.neutron [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Updating instance_info_cache with network_info: [{"id": "e6b4fe04-03d3-4ff9-b41e-73373dfb2f25", "address": "fa:16:3e:4a:89:04", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6b4fe04-03", "ovs_interfaceid": "e6b4fe04-03d3-4ff9-b41e-73373dfb2f25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:28:56 compute-0 nova_compute[192698]: 2025-10-01 14:28:56.418 2 DEBUG oslo_concurrency.lockutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-8668fad5-f310-4dec-9960-6e26c28db100" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:28:56 compute-0 nova_compute[192698]: 2025-10-01 14:28:56.940 2 DEBUG oslo_concurrency.lockutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:28:56 compute-0 nova_compute[192698]: 2025-10-01 14:28:56.941 2 DEBUG oslo_concurrency.lockutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:28:56 compute-0 nova_compute[192698]: 2025-10-01 14:28:56.941 2 DEBUG oslo_concurrency.lockutils [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:28:56 compute-0 nova_compute[192698]: 2025-10-01 14:28:56.947 2 INFO nova.virt.libvirt.driver [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 01 14:28:56 compute-0 virtqemud[192597]: Domain id=20 name='instance-0000001b' uuid=8668fad5-f310-4dec-9960-6e26c28db100 is tainted: custom-monitor
Oct 01 14:28:57 compute-0 podman[226271]: 2025-10-01 14:28:57.154653065 +0000 UTC m=+0.074560017 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 01 14:28:57 compute-0 podman[226272]: 2025-10-01 14:28:57.212844891 +0000 UTC m=+0.124733927 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct 01 14:28:57 compute-0 nova_compute[192698]: 2025-10-01 14:28:57.956 2 INFO nova.virt.libvirt.driver [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 01 14:28:58 compute-0 nova_compute[192698]: 2025-10-01 14:28:58.965 2 INFO nova.virt.libvirt.driver [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 01 14:28:58 compute-0 nova_compute[192698]: 2025-10-01 14:28:58.987 2 DEBUG nova.compute.manager [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:28:58 compute-0 nova_compute[192698]: 2025-10-01 14:28:58.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:59 compute-0 unix_chkpwd[226316]: password check failed for user (root)
Oct 01 14:28:59 compute-0 sshd-session[226269]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:28:59 compute-0 nova_compute[192698]: 2025-10-01 14:28:59.517 2 DEBUG nova.objects.instance [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:28:59 compute-0 sshd-session[226200]: Connection closed by authenticating user root 101.47.181.100 port 54450 [preauth]
Oct 01 14:28:59 compute-0 nova_compute[192698]: 2025-10-01 14:28:59.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:28:59 compute-0 podman[203144]: time="2025-10-01T14:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:28:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:28:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3492 "" "Go-http-client/1.1"
Oct 01 14:29:00 compute-0 nova_compute[192698]: 2025-10-01 14:29:00.539 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:00 compute-0 nova_compute[192698]: 2025-10-01 14:29:00.679 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:00 compute-0 nova_compute[192698]: 2025-10-01 14:29:00.679 2 WARNING neutronclient.v2_0.client [None req-6da4904a-b371-4baf-ad4b-3d9d59750c6c a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:01 compute-0 sshd-session[226269]: Failed password for root from 101.47.181.100 port 54460 ssh2
Oct 01 14:29:01 compute-0 openstack_network_exporter[205307]: ERROR   14:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:29:01 compute-0 openstack_network_exporter[205307]: ERROR   14:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:29:01 compute-0 openstack_network_exporter[205307]: ERROR   14:29:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:29:01 compute-0 openstack_network_exporter[205307]: ERROR   14:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:29:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:29:01 compute-0 openstack_network_exporter[205307]: ERROR   14:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:29:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:29:02 compute-0 sshd-session[226269]: Connection closed by authenticating user root 101.47.181.100 port 54460 [preauth]
Oct 01 14:29:03 compute-0 nova_compute[192698]: 2025-10-01 14:29:03.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:04 compute-0 podman[226319]: 2025-10-01 14:29:04.141940329 +0000 UTC m=+0.061560937 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 01 14:29:04 compute-0 unix_chkpwd[226341]: password check failed for user (root)
Oct 01 14:29:04 compute-0 sshd-session[226317]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:29:04 compute-0 nova_compute[192698]: 2025-10-01 14:29:04.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:06 compute-0 sshd-session[226317]: Failed password for root from 101.47.181.100 port 44986 ssh2
Oct 01 14:29:07 compute-0 sshd-session[226317]: Connection closed by authenticating user root 101.47.181.100 port 44986 [preauth]
Oct 01 14:29:08 compute-0 nova_compute[192698]: 2025-10-01 14:29:08.852 2 DEBUG nova.compute.manager [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyj7glsxk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='247a480d-9e4d-4832-9646-d263b8a3035e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 01 14:29:09 compute-0 nova_compute[192698]: 2025-10-01 14:29:09.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:09 compute-0 nova_compute[192698]: 2025-10-01 14:29:09.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:09 compute-0 nova_compute[192698]: 2025-10-01 14:29:09.875 2 DEBUG oslo_concurrency.lockutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-247a480d-9e4d-4832-9646-d263b8a3035e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:29:09 compute-0 nova_compute[192698]: 2025-10-01 14:29:09.876 2 DEBUG oslo_concurrency.lockutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-247a480d-9e4d-4832-9646-d263b8a3035e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:29:09 compute-0 nova_compute[192698]: 2025-10-01 14:29:09.876 2 DEBUG nova.network.neutron [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:29:10 compute-0 nova_compute[192698]: 2025-10-01 14:29:10.385 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:10 compute-0 nova_compute[192698]: 2025-10-01 14:29:10.799 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:11 compute-0 podman[226345]: 2025-10-01 14:29:11.172944193 +0000 UTC m=+0.077789854 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct 01 14:29:11 compute-0 podman[226344]: 2025-10-01 14:29:11.182965982 +0000 UTC m=+0.088930353 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 01 14:29:11 compute-0 unix_chkpwd[226382]: password check failed for user (root)
Oct 01 14:29:11 compute-0 sshd-session[226342]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:29:11 compute-0 nova_compute[192698]: 2025-10-01 14:29:11.616 2 DEBUG nova.network.neutron [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Updating instance_info_cache with network_info: [{"id": "646ff317-9aed-4b52-80f0-ae16e4a76056", "address": "fa:16:3e:f1:ea:d0", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap646ff317-9a", "ovs_interfaceid": "646ff317-9aed-4b52-80f0-ae16e4a76056", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.123 2 DEBUG oslo_concurrency.lockutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-247a480d-9e4d-4832-9646-d263b8a3035e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.138 2 DEBUG nova.virt.libvirt.driver [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyj7glsxk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='247a480d-9e4d-4832-9646-d263b8a3035e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.138 2 DEBUG nova.virt.libvirt.driver [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Creating instance directory: /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.139 2 DEBUG nova.virt.libvirt.driver [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Creating disk.info with the contents: {'/var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk': 'qcow2', '/var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.139 2 DEBUG nova.virt.libvirt.driver [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.139 2 DEBUG nova.objects.instance [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 247a480d-9e4d-4832-9646-d263b8a3035e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.645 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.653 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.655 2 DEBUG oslo_concurrency.processutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.737 2 DEBUG oslo_concurrency.processutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.738 2 DEBUG oslo_concurrency.lockutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.739 2 DEBUG oslo_concurrency.lockutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.740 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.746 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.747 2 DEBUG oslo_concurrency.processutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.812 2 DEBUG oslo_concurrency.processutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.813 2 DEBUG oslo_concurrency.processutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.848 2 DEBUG oslo_concurrency.processutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.849 2 DEBUG oslo_concurrency.lockutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.849 2 DEBUG oslo_concurrency.processutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.911 2 DEBUG oslo_concurrency.processutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.912 2 DEBUG nova.virt.disk.api [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.912 2 DEBUG oslo_concurrency.processutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.964 2 DEBUG oslo_concurrency.processutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.965 2 DEBUG nova.virt.disk.api [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:29:12 compute-0 nova_compute[192698]: 2025-10-01 14:29:12.965 2 DEBUG nova.objects.instance [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 247a480d-9e4d-4832-9646-d263b8a3035e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:29:13 compute-0 sshd-session[226342]: Failed password for root from 101.47.181.100 port 42784 ssh2
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.474 2 DEBUG nova.objects.base [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<247a480d-9e4d-4832-9646-d263b8a3035e> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.475 2 DEBUG oslo_concurrency.processutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.514 2 DEBUG oslo_concurrency.processutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk.config 497664" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.515 2 DEBUG nova.virt.libvirt.driver [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.517 2 DEBUG nova.virt.libvirt.vif [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-01T14:27:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1888530',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1888530',id=26,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:27:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-rgo0d6jh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:27:42Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=247a480d-9e4d-4832-9646-d263b8a3035e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "646ff317-9aed-4b52-80f0-ae16e4a76056", "address": "fa:16:3e:f1:ea:d0", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap646ff317-9a", "ovs_interfaceid": "646ff317-9aed-4b52-80f0-ae16e4a76056", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.517 2 DEBUG nova.network.os_vif_util [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "646ff317-9aed-4b52-80f0-ae16e4a76056", "address": "fa:16:3e:f1:ea:d0", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap646ff317-9a", "ovs_interfaceid": "646ff317-9aed-4b52-80f0-ae16e4a76056", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.518 2 DEBUG nova.network.os_vif_util [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f1:ea:d0,bridge_name='br-int',has_traffic_filtering=True,id=646ff317-9aed-4b52-80f0-ae16e4a76056,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap646ff317-9a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.519 2 DEBUG os_vif [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:ea:d0,bridge_name='br-int',has_traffic_filtering=True,id=646ff317-9aed-4b52-80f0-ae16e4a76056,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap646ff317-9a') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.520 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.521 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.523 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a4837eeb-ee81-528a-bac6-f3d6df5491d1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.531 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap646ff317-9a, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.532 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap646ff317-9a, col_values=(('qos', UUID('e16d1862-63c3-4640-b8d5-cabd149320d8')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.532 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap646ff317-9a, col_values=(('external_ids', {'iface-id': '646ff317-9aed-4b52-80f0-ae16e4a76056', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f1:ea:d0', 'vm-uuid': '247a480d-9e4d-4832-9646-d263b8a3035e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:13 compute-0 NetworkManager[51741]: <info>  [1759328953.5359] manager: (tap646ff317-9a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.546 2 INFO os_vif [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f1:ea:d0,bridge_name='br-int',has_traffic_filtering=True,id=646ff317-9aed-4b52-80f0-ae16e4a76056,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap646ff317-9a')
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.546 2 DEBUG nova.virt.libvirt.driver [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.547 2 DEBUG nova.compute.manager [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyj7glsxk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='247a480d-9e4d-4832-9646-d263b8a3035e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.548 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.679 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:13 compute-0 nova_compute[192698]: 2025-10-01 14:29:13.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:13.989 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:29:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:13.991 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:29:14 compute-0 sshd-session[226342]: Connection closed by authenticating user root 101.47.181.100 port 42784 [preauth]
Oct 01 14:29:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:14.292 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:14.293 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:14.294 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:14 compute-0 nova_compute[192698]: 2025-10-01 14:29:14.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:14 compute-0 nova_compute[192698]: 2025-10-01 14:29:14.867 2 DEBUG nova.network.neutron [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Port 646ff317-9aed-4b52-80f0-ae16e4a76056 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 01 14:29:14 compute-0 nova_compute[192698]: 2025-10-01 14:29:14.884 2 DEBUG nova.compute.manager [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpyj7glsxk',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='247a480d-9e4d-4832-9646-d263b8a3035e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 01 14:29:17 compute-0 unix_chkpwd[226410]: password check failed for user (root)
Oct 01 14:29:17 compute-0 sshd-session[226407]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:29:17 compute-0 podman[226409]: 2025-10-01 14:29:17.174557233 +0000 UTC m=+0.082563092 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:29:17 compute-0 kernel: tap646ff317-9a: entered promiscuous mode
Oct 01 14:29:17 compute-0 NetworkManager[51741]: <info>  [1759328957.8691] manager: (tap646ff317-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Oct 01 14:29:17 compute-0 nova_compute[192698]: 2025-10-01 14:29:17.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:17 compute-0 ovn_controller[94909]: 2025-10-01T14:29:17Z|00217|binding|INFO|Claiming lport 646ff317-9aed-4b52-80f0-ae16e4a76056 for this additional chassis.
Oct 01 14:29:17 compute-0 ovn_controller[94909]: 2025-10-01T14:29:17Z|00218|binding|INFO|646ff317-9aed-4b52-80f0-ae16e4a76056: Claiming fa:16:3e:f1:ea:d0 10.100.0.11
Oct 01 14:29:17 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:17.881 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:ea:d0 10.100.0.11'], port_security=['fa:16:3e:f1:ea:d0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '247a480d-9e4d-4832-9646-d263b8a3035e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '10', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=646ff317-9aed-4b52-80f0-ae16e4a76056) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:29:17 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:17.882 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 646ff317-9aed-4b52-80f0-ae16e4a76056 in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:29:17 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:17.883 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:29:17 compute-0 ovn_controller[94909]: 2025-10-01T14:29:17Z|00219|binding|INFO|Setting lport 646ff317-9aed-4b52-80f0-ae16e4a76056 ovn-installed in OVS
Oct 01 14:29:17 compute-0 nova_compute[192698]: 2025-10-01 14:29:17.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:17 compute-0 nova_compute[192698]: 2025-10-01 14:29:17.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:17 compute-0 nova_compute[192698]: 2025-10-01 14:29:17.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:17 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:17.907 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[35b00905-4076-4eac-9021-ac9e51ee0cf9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:17 compute-0 systemd-udevd[226450]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:29:17 compute-0 systemd-machined[152704]: New machine qemu-21-instance-0000001a.
Oct 01 14:29:17 compute-0 NetworkManager[51741]: <info>  [1759328957.9319] device (tap646ff317-9a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:29:17 compute-0 NetworkManager[51741]: <info>  [1759328957.9344] device (tap646ff317-9a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:29:17 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000001a.
Oct 01 14:29:17 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:17.959 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e8c0c4-815f-435f-ae07-c2a6ccddb64c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:17 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:17.963 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[8700f63d-c3c5-4a87-9999-0aac220c660e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:18.003 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[ee373400-15f8-415a-9dc0-7198f3b5097f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:18.025 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d4c6614e-15aa-49c5-a5bc-32973b0a94df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 528958, 'reachable_time': 33985, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226463, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:18.045 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a8521887-f6f7-4585-a34f-0d905a359eeb]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 528976, 'tstamp': 528976}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226465, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 528980, 'tstamp': 528980}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226465, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:18.047 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:18 compute-0 nova_compute[192698]: 2025-10-01 14:29:18.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:18.082 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:18 compute-0 nova_compute[192698]: 2025-10-01 14:29:18.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:18.082 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:29:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:18.083 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:18.083 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:29:18 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:18.085 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[cb9bca63-843c-434c-9e43-d7e30407a643]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:18 compute-0 nova_compute[192698]: 2025-10-01 14:29:18.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:19 compute-0 sshd-session[226407]: Failed password for root from 101.47.181.100 port 42786 ssh2
Oct 01 14:29:19 compute-0 nova_compute[192698]: 2025-10-01 14:29:19.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:19 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:19.992 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:20 compute-0 ovn_controller[94909]: 2025-10-01T14:29:20Z|00220|binding|INFO|Claiming lport 646ff317-9aed-4b52-80f0-ae16e4a76056 for this chassis.
Oct 01 14:29:20 compute-0 ovn_controller[94909]: 2025-10-01T14:29:20Z|00221|binding|INFO|646ff317-9aed-4b52-80f0-ae16e4a76056: Claiming fa:16:3e:f1:ea:d0 10.100.0.11
Oct 01 14:29:20 compute-0 ovn_controller[94909]: 2025-10-01T14:29:20Z|00222|binding|INFO|Setting lport 646ff317-9aed-4b52-80f0-ae16e4a76056 up in Southbound
Oct 01 14:29:21 compute-0 sshd-session[226407]: Connection closed by authenticating user root 101.47.181.100 port 42786 [preauth]
Oct 01 14:29:22 compute-0 nova_compute[192698]: 2025-10-01 14:29:22.613 2 INFO nova.compute.manager [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Post operation of migration started
Oct 01 14:29:22 compute-0 nova_compute[192698]: 2025-10-01 14:29:22.614 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:22 compute-0 nova_compute[192698]: 2025-10-01 14:29:22.750 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:22 compute-0 nova_compute[192698]: 2025-10-01 14:29:22.751 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:22 compute-0 nova_compute[192698]: 2025-10-01 14:29:22.819 2 DEBUG oslo_concurrency.lockutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-247a480d-9e4d-4832-9646-d263b8a3035e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:29:22 compute-0 nova_compute[192698]: 2025-10-01 14:29:22.819 2 DEBUG oslo_concurrency.lockutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-247a480d-9e4d-4832-9646-d263b8a3035e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:29:22 compute-0 nova_compute[192698]: 2025-10-01 14:29:22.820 2 DEBUG nova.network.neutron [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:29:23 compute-0 nova_compute[192698]: 2025-10-01 14:29:23.328 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:23 compute-0 nova_compute[192698]: 2025-10-01 14:29:23.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:23 compute-0 nova_compute[192698]: 2025-10-01 14:29:23.936 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:24 compute-0 nova_compute[192698]: 2025-10-01 14:29:24.157 2 DEBUG nova.network.neutron [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Updating instance_info_cache with network_info: [{"id": "646ff317-9aed-4b52-80f0-ae16e4a76056", "address": "fa:16:3e:f1:ea:d0", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap646ff317-9a", "ovs_interfaceid": "646ff317-9aed-4b52-80f0-ae16e4a76056", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:29:24 compute-0 unix_chkpwd[226490]: password check failed for user (root)
Oct 01 14:29:24 compute-0 sshd-session[226488]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:29:24 compute-0 nova_compute[192698]: 2025-10-01 14:29:24.663 2 DEBUG oslo_concurrency.lockutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-247a480d-9e4d-4832-9646-d263b8a3035e" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:29:24 compute-0 nova_compute[192698]: 2025-10-01 14:29:24.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:25 compute-0 nova_compute[192698]: 2025-10-01 14:29:25.185 2 DEBUG oslo_concurrency.lockutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:25 compute-0 nova_compute[192698]: 2025-10-01 14:29:25.186 2 DEBUG oslo_concurrency.lockutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:25 compute-0 nova_compute[192698]: 2025-10-01 14:29:25.187 2 DEBUG oslo_concurrency.lockutils [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:25 compute-0 nova_compute[192698]: 2025-10-01 14:29:25.195 2 INFO nova.virt.libvirt.driver [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 01 14:29:25 compute-0 virtqemud[192597]: Domain id=21 name='instance-0000001a' uuid=247a480d-9e4d-4832-9646-d263b8a3035e is tainted: custom-monitor
Oct 01 14:29:26 compute-0 nova_compute[192698]: 2025-10-01 14:29:26.206 2 INFO nova.virt.libvirt.driver [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 01 14:29:26 compute-0 nova_compute[192698]: 2025-10-01 14:29:26.434 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:29:26 compute-0 sshd-session[226488]: Failed password for root from 101.47.181.100 port 47362 ssh2
Oct 01 14:29:27 compute-0 nova_compute[192698]: 2025-10-01 14:29:27.213 2 INFO nova.virt.libvirt.driver [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 01 14:29:27 compute-0 nova_compute[192698]: 2025-10-01 14:29:27.217 2 DEBUG nova.compute.manager [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:29:27 compute-0 nova_compute[192698]: 2025-10-01 14:29:27.726 2 DEBUG nova.objects.instance [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:29:28 compute-0 podman[226491]: 2025-10-01 14:29:28.20614216 +0000 UTC m=+0.096112097 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct 01 14:29:28 compute-0 podman[226492]: 2025-10-01 14:29:28.242813886 +0000 UTC m=+0.130407869 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Oct 01 14:29:28 compute-0 nova_compute[192698]: 2025-10-01 14:29:28.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:28 compute-0 nova_compute[192698]: 2025-10-01 14:29:28.745 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:29 compute-0 nova_compute[192698]: 2025-10-01 14:29:29.341 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:29 compute-0 nova_compute[192698]: 2025-10-01 14:29:29.342 2 WARNING neutronclient.v2_0.client [None req-6b5e0af6-fa58-4fa6-81c9-540bd929d8c3 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:29 compute-0 unix_chkpwd[226534]: password check failed for user (root)
Oct 01 14:29:29 compute-0 sshd-session[226532]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 01 14:29:29 compute-0 nova_compute[192698]: 2025-10-01 14:29:29.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:29 compute-0 podman[203144]: time="2025-10-01T14:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:29:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:29:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3492 "" "Go-http-client/1.1"
Oct 01 14:29:29 compute-0 nova_compute[192698]: 2025-10-01 14:29:29.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:29:29 compute-0 nova_compute[192698]: 2025-10-01 14:29:29.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:29:30 compute-0 nova_compute[192698]: 2025-10-01 14:29:30.437 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:30 compute-0 nova_compute[192698]: 2025-10-01 14:29:30.437 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:30 compute-0 nova_compute[192698]: 2025-10-01 14:29:30.438 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:30 compute-0 nova_compute[192698]: 2025-10-01 14:29:30.438 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:29:31 compute-0 openstack_network_exporter[205307]: ERROR   14:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:29:31 compute-0 openstack_network_exporter[205307]: ERROR   14:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:29:31 compute-0 openstack_network_exporter[205307]: ERROR   14:29:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:29:31 compute-0 openstack_network_exporter[205307]: ERROR   14:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:29:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:29:31 compute-0 openstack_network_exporter[205307]: ERROR   14:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:29:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:29:31 compute-0 sshd-session[226532]: Failed password for root from 193.46.255.99 port 50144 ssh2
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.501 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.592 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.593 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.677 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.684 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.740 2 DEBUG oslo_concurrency.lockutils [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "8668fad5-f310-4dec-9960-6e26c28db100" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.741 2 DEBUG oslo_concurrency.lockutils [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "8668fad5-f310-4dec-9960-6e26c28db100" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.742 2 DEBUG oslo_concurrency.lockutils [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "8668fad5-f310-4dec-9960-6e26c28db100-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.742 2 DEBUG oslo_concurrency.lockutils [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "8668fad5-f310-4dec-9960-6e26c28db100-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.742 2 DEBUG oslo_concurrency.lockutils [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "8668fad5-f310-4dec-9960-6e26c28db100-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.762 2 INFO nova.compute.manager [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Terminating instance
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.778 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.779 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:29:31 compute-0 nova_compute[192698]: 2025-10-01 14:29:31.866 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.074 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.076 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.105 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.107 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5539MB free_disk=73.24490356445312GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.107 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.108 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:32 compute-0 unix_chkpwd[226549]: password check failed for user (root)
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.290 2 DEBUG nova.compute.manager [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:29:32 compute-0 kernel: tape6b4fe04-03 (unregistering): left promiscuous mode
Oct 01 14:29:32 compute-0 NetworkManager[51741]: <info>  [1759328972.3226] device (tape6b4fe04-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:29:32 compute-0 ovn_controller[94909]: 2025-10-01T14:29:32Z|00223|binding|INFO|Releasing lport e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 from this chassis (sb_readonly=0)
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:32 compute-0 ovn_controller[94909]: 2025-10-01T14:29:32Z|00224|binding|INFO|Setting lport e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 down in Southbound
Oct 01 14:29:32 compute-0 ovn_controller[94909]: 2025-10-01T14:29:32Z|00225|binding|INFO|Removing iface tape6b4fe04-03 ovn-installed in OVS
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.365 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:89:04 10.100.0.7'], port_security=['fa:16:3e:4a:89:04 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '8668fad5-f310-4dec-9960-6e26c28db100', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '14', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=e6b4fe04-03d3-4ff9-b41e-73373dfb2f25) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.367 103791 INFO neutron.agent.ovn.metadata.agent [-] Port e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.369 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 031a8987-8430-4fb6-a464-01e4dca2fae7
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.394 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1d55259c-4ed1-467f-b695-ba86aba7c5be]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:32 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Oct 01 14:29:32 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001b.scope: Consumed 3.912s CPU time.
Oct 01 14:29:32 compute-0 systemd-machined[152704]: Machine qemu-20-instance-0000001b terminated.
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.443 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[711e4699-b839-4460-b2fc-874b074b9540]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.447 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[4ef657cc-b050-4a9e-b976-c91ad045c21e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.496 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[48d48d9b-b895-48a6-a2be-fc1fb9a17da8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.524 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[b4ed8396-f893-4873-bbe2-8a2a80353b94]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap031a8987-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:6c:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 528958, 'reachable_time': 33985, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226563, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.549 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[de36eea3-69d0-4fe6-bfa0-b616c0083ffd]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 528976, 'tstamp': 528976}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226570, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap031a8987-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 528980, 'tstamp': 528980}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226570, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.551 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.558 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap031a8987-80, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.558 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.559 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap031a8987-80, col_values=(('external_ids', {'iface-id': '6dd814dc-cba2-4392-85ef-eadb8c4615f7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.559 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:29:32 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:32.560 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a579471c-5a5a-4d08-b977-5cefbcfcb0ca]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-031a8987-8430-4fb6-a464-01e4dca2fae7\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 031a8987-8430-4fb6-a464-01e4dca2fae7\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.584 2 INFO nova.virt.libvirt.driver [-] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Instance destroyed successfully.
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.586 2 DEBUG nova.objects.instance [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lazy-loading 'resources' on Instance uuid 8668fad5-f310-4dec-9960-6e26c28db100 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.662 2 DEBUG nova.compute.manager [req-5614edbe-e35c-4b7a-a078-956ef21eb255 req-8bb62cf8-ef03-4e94-82ad-c549e78d3832 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Received event network-vif-unplugged-e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.663 2 DEBUG oslo_concurrency.lockutils [req-5614edbe-e35c-4b7a-a078-956ef21eb255 req-8bb62cf8-ef03-4e94-82ad-c549e78d3832 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "8668fad5-f310-4dec-9960-6e26c28db100-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.663 2 DEBUG oslo_concurrency.lockutils [req-5614edbe-e35c-4b7a-a078-956ef21eb255 req-8bb62cf8-ef03-4e94-82ad-c549e78d3832 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "8668fad5-f310-4dec-9960-6e26c28db100-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.664 2 DEBUG oslo_concurrency.lockutils [req-5614edbe-e35c-4b7a-a078-956ef21eb255 req-8bb62cf8-ef03-4e94-82ad-c549e78d3832 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "8668fad5-f310-4dec-9960-6e26c28db100-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.664 2 DEBUG nova.compute.manager [req-5614edbe-e35c-4b7a-a078-956ef21eb255 req-8bb62cf8-ef03-4e94-82ad-c549e78d3832 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] No waiting events found dispatching network-vif-unplugged-e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:29:32 compute-0 nova_compute[192698]: 2025-10-01 14:29:32.664 2 DEBUG nova.compute.manager [req-5614edbe-e35c-4b7a-a078-956ef21eb255 req-8bb62cf8-ef03-4e94-82ad-c549e78d3832 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Received event network-vif-unplugged-e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.092 2 DEBUG nova.virt.libvirt.vif [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-01T14:27:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1324074072',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1324074072',id=27,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:28:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-th77tlj1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',clean_attempts='1',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:29:00Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=8668fad5-f310-4dec-9960-6e26c28db100,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e6b4fe04-03d3-4ff9-b41e-73373dfb2f25", "address": "fa:16:3e:4a:89:04", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6b4fe04-03", "ovs_interfaceid": "e6b4fe04-03d3-4ff9-b41e-73373dfb2f25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.093 2 DEBUG nova.network.os_vif_util [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "e6b4fe04-03d3-4ff9-b41e-73373dfb2f25", "address": "fa:16:3e:4a:89:04", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape6b4fe04-03", "ovs_interfaceid": "e6b4fe04-03d3-4ff9-b41e-73373dfb2f25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.094 2 DEBUG nova.network.os_vif_util [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4a:89:04,bridge_name='br-int',has_traffic_filtering=True,id=e6b4fe04-03d3-4ff9-b41e-73373dfb2f25,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6b4fe04-03') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.095 2 DEBUG os_vif [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:89:04,bridge_name='br-int',has_traffic_filtering=True,id=e6b4fe04-03d3-4ff9-b41e-73373dfb2f25,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6b4fe04-03') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.099 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape6b4fe04-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.104 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0c3b76d3-6571-4116-9585-4ced4ccd6eb1) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.111 2 INFO os_vif [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:89:04,bridge_name='br-int',has_traffic_filtering=True,id=e6b4fe04-03d3-4ff9-b41e-73373dfb2f25,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape6b4fe04-03')
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.112 2 INFO nova.virt.libvirt.driver [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Deleting instance files /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100_del
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.113 2 INFO nova.virt.libvirt.driver [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Deletion of /var/lib/nova/instances/8668fad5-f310-4dec-9960-6e26c28db100_del complete
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.133 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Applying migration context for instance 247a480d-9e4d-4832-9646-d263b8a3035e as it has an incoming, in-progress migration 4c4ee9c7-ac0d-4287-b41d-f72d92074886. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.133 2 DEBUG nova.objects.instance [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.630 2 INFO nova.compute.manager [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Took 1.34 seconds to destroy the instance on the hypervisor.
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.632 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.633 2 DEBUG nova.compute.manager [-] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.633 2 DEBUG nova.network.neutron [-] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:29:33 compute-0 nova_compute[192698]: 2025-10-01 14:29:33.634 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.150 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.182 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 247a480d-9e4d-4832-9646-d263b8a3035e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.183 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 8668fad5-f310-4dec-9960-6e26c28db100 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.183 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.183 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:29:32 up  1:28,  0 user,  load average: 0.07, 0.18, 0.26\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_d43115e3729442e1b68b749acc0dabc8': '2', 'io_workload': '0', 'num_task_deleting': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.296 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:29:34 compute-0 sshd-session[226532]: Failed password for root from 193.46.255.99 port 50144 ssh2
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.546 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.717 2 DEBUG nova.compute.manager [req-b45e6a73-131f-4c48-bfdd-fc0815d83bb8 req-8bd0c9b2-a5e1-48d7-a26b-29cee8304000 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Received event network-vif-unplugged-e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.718 2 DEBUG oslo_concurrency.lockutils [req-b45e6a73-131f-4c48-bfdd-fc0815d83bb8 req-8bd0c9b2-a5e1-48d7-a26b-29cee8304000 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "8668fad5-f310-4dec-9960-6e26c28db100-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.718 2 DEBUG oslo_concurrency.lockutils [req-b45e6a73-131f-4c48-bfdd-fc0815d83bb8 req-8bd0c9b2-a5e1-48d7-a26b-29cee8304000 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "8668fad5-f310-4dec-9960-6e26c28db100-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.718 2 DEBUG oslo_concurrency.lockutils [req-b45e6a73-131f-4c48-bfdd-fc0815d83bb8 req-8bd0c9b2-a5e1-48d7-a26b-29cee8304000 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "8668fad5-f310-4dec-9960-6e26c28db100-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.718 2 DEBUG nova.compute.manager [req-b45e6a73-131f-4c48-bfdd-fc0815d83bb8 req-8bd0c9b2-a5e1-48d7-a26b-29cee8304000 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] No waiting events found dispatching network-vif-unplugged-e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.719 2 DEBUG nova.compute.manager [req-b45e6a73-131f-4c48-bfdd-fc0815d83bb8 req-8bd0c9b2-a5e1-48d7-a26b-29cee8304000 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Received event network-vif-unplugged-e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:34 compute-0 nova_compute[192698]: 2025-10-01 14:29:34.804 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:29:34 compute-0 unix_chkpwd[226584]: password check failed for user (root)
Oct 01 14:29:35 compute-0 podman[226585]: 2025-10-01 14:29:35.170893319 +0000 UTC m=+0.080977940 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct 01 14:29:35 compute-0 nova_compute[192698]: 2025-10-01 14:29:35.316 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:29:35 compute-0 nova_compute[192698]: 2025-10-01 14:29:35.316 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.208s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:35 compute-0 nova_compute[192698]: 2025-10-01 14:29:35.365 2 DEBUG nova.compute.manager [req-b4af391e-bba0-4986-8b4d-ebd68e1ae110 req-e10c7c06-4832-4158-b05b-086cfbad2194 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Received event network-vif-deleted-e6b4fe04-03d3-4ff9-b41e-73373dfb2f25 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:29:35 compute-0 nova_compute[192698]: 2025-10-01 14:29:35.366 2 INFO nova.compute.manager [req-b4af391e-bba0-4986-8b4d-ebd68e1ae110 req-e10c7c06-4832-4158-b05b-086cfbad2194 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Neutron deleted interface e6b4fe04-03d3-4ff9-b41e-73373dfb2f25; detaching it from the instance and deleting it from the info cache
Oct 01 14:29:35 compute-0 nova_compute[192698]: 2025-10-01 14:29:35.366 2 DEBUG nova.network.neutron [req-b4af391e-bba0-4986-8b4d-ebd68e1ae110 req-e10c7c06-4832-4158-b05b-086cfbad2194 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:29:35 compute-0 nova_compute[192698]: 2025-10-01 14:29:35.793 2 DEBUG nova.network.neutron [-] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:29:35 compute-0 nova_compute[192698]: 2025-10-01 14:29:35.874 2 DEBUG nova.compute.manager [req-b4af391e-bba0-4986-8b4d-ebd68e1ae110 req-e10c7c06-4832-4158-b05b-086cfbad2194 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Detach interface failed, port_id=e6b4fe04-03d3-4ff9-b41e-73373dfb2f25, reason: Instance 8668fad5-f310-4dec-9960-6e26c28db100 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:29:36 compute-0 nova_compute[192698]: 2025-10-01 14:29:36.299 2 INFO nova.compute.manager [-] [instance: 8668fad5-f310-4dec-9960-6e26c28db100] Took 2.67 seconds to deallocate network for instance.
Oct 01 14:29:36 compute-0 nova_compute[192698]: 2025-10-01 14:29:36.319 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:29:36 compute-0 sshd-session[226532]: Failed password for root from 193.46.255.99 port 50144 ssh2
Oct 01 14:29:36 compute-0 nova_compute[192698]: 2025-10-01 14:29:36.827 2 DEBUG oslo_concurrency.lockutils [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:36 compute-0 nova_compute[192698]: 2025-10-01 14:29:36.827 2 DEBUG oslo_concurrency.lockutils [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:36 compute-0 nova_compute[192698]: 2025-10-01 14:29:36.834 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:29:36 compute-0 nova_compute[192698]: 2025-10-01 14:29:36.834 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:29:36 compute-0 nova_compute[192698]: 2025-10-01 14:29:36.953 2 DEBUG nova.compute.provider_tree [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:29:36 compute-0 unix_chkpwd[226608]: password check failed for user (root)
Oct 01 14:29:36 compute-0 sshd-session[226582]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:29:37 compute-0 nova_compute[192698]: 2025-10-01 14:29:37.461 2 DEBUG nova.scheduler.client.report [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:29:37 compute-0 sshd-session[226532]: Received disconnect from 193.46.255.99 port 50144:11:  [preauth]
Oct 01 14:29:37 compute-0 sshd-session[226532]: Disconnected from authenticating user root 193.46.255.99 port 50144 [preauth]
Oct 01 14:29:37 compute-0 sshd-session[226532]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 01 14:29:37 compute-0 nova_compute[192698]: 2025-10-01 14:29:37.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:29:37 compute-0 nova_compute[192698]: 2025-10-01 14:29:37.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:29:37 compute-0 nova_compute[192698]: 2025-10-01 14:29:37.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:29:37 compute-0 nova_compute[192698]: 2025-10-01 14:29:37.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:29:37 compute-0 nova_compute[192698]: 2025-10-01 14:29:37.970 2 DEBUG oslo_concurrency.lockutils [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.143s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:38 compute-0 nova_compute[192698]: 2025-10-01 14:29:38.002 2 INFO nova.scheduler.client.report [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Deleted allocations for instance 8668fad5-f310-4dec-9960-6e26c28db100
Oct 01 14:29:38 compute-0 nova_compute[192698]: 2025-10-01 14:29:38.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:38 compute-0 unix_chkpwd[226611]: password check failed for user (root)
Oct 01 14:29:38 compute-0 sshd-session[226609]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 01 14:29:39 compute-0 nova_compute[192698]: 2025-10-01 14:29:39.034 2 DEBUG oslo_concurrency.lockutils [None req-8c8b038c-54ba-432b-9612-834318b40830 f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "8668fad5-f310-4dec-9960-6e26c28db100" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.293s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:39 compute-0 sshd-session[226582]: Failed password for root from 101.47.181.100 port 33380 ssh2
Oct 01 14:29:39 compute-0 nova_compute[192698]: 2025-10-01 14:29:39.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:39 compute-0 sshd-session[226582]: Connection closed by authenticating user root 101.47.181.100 port 33380 [preauth]
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.222 2 DEBUG oslo_concurrency.lockutils [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "247a480d-9e4d-4832-9646-d263b8a3035e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.222 2 DEBUG oslo_concurrency.lockutils [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "247a480d-9e4d-4832-9646-d263b8a3035e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.223 2 DEBUG oslo_concurrency.lockutils [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "247a480d-9e4d-4832-9646-d263b8a3035e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.223 2 DEBUG oslo_concurrency.lockutils [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "247a480d-9e4d-4832-9646-d263b8a3035e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.223 2 DEBUG oslo_concurrency.lockutils [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "247a480d-9e4d-4832-9646-d263b8a3035e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.236 2 INFO nova.compute.manager [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Terminating instance
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.753 2 DEBUG nova.compute.manager [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:29:40 compute-0 kernel: tap646ff317-9a (unregistering): left promiscuous mode
Oct 01 14:29:40 compute-0 NetworkManager[51741]: <info>  [1759328980.8013] device (tap646ff317-9a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:29:40 compute-0 ovn_controller[94909]: 2025-10-01T14:29:40Z|00226|binding|INFO|Releasing lport 646ff317-9aed-4b52-80f0-ae16e4a76056 from this chassis (sb_readonly=0)
Oct 01 14:29:40 compute-0 ovn_controller[94909]: 2025-10-01T14:29:40Z|00227|binding|INFO|Setting lport 646ff317-9aed-4b52-80f0-ae16e4a76056 down in Southbound
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:40 compute-0 ovn_controller[94909]: 2025-10-01T14:29:40Z|00228|binding|INFO|Removing iface tap646ff317-9a ovn-installed in OVS
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:40.819 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f1:ea:d0 10.100.0.11'], port_security=['fa:16:3e:f1:ea:d0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '247a480d-9e4d-4832-9646-d263b8a3035e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-031a8987-8430-4fb6-a464-01e4dca2fae7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd43115e3729442e1b68b749acc0dabc8', 'neutron:revision_number': '15', 'neutron:security_group_ids': '43a3232d-93b1-43af-a9a3-1fde49b4460d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd1914da-f1b0-4097-9d6b-24a3870871dc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=646ff317-9aed-4b52-80f0-ae16e4a76056) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:29:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:40.820 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 646ff317-9aed-4b52-80f0-ae16e4a76056 in datapath 031a8987-8430-4fb6-a464-01e4dca2fae7 unbound from our chassis
Oct 01 14:29:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:40.821 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 031a8987-8430-4fb6-a464-01e4dca2fae7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:29:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:40.822 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4665c614-f694-4bfc-bb94-ea0ba8b0d60b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:40 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:40.823 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 namespace which is not needed anymore
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:40 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Oct 01 14:29:40 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001a.scope: Consumed 2.616s CPU time.
Oct 01 14:29:40 compute-0 systemd-machined[152704]: Machine qemu-21-instance-0000001a terminated.
Oct 01 14:29:40 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[226238]: [NOTICE]   (226242) : haproxy version is 3.0.5-8e879a5
Oct 01 14:29:40 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[226238]: [NOTICE]   (226242) : path to executable is /usr/sbin/haproxy
Oct 01 14:29:40 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[226238]: [WARNING]  (226242) : Exiting Master process...
Oct 01 14:29:40 compute-0 podman[226635]: 2025-10-01 14:29:40.955460826 +0000 UTC m=+0.026712909 container kill e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 01 14:29:40 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[226238]: [ALERT]    (226242) : Current worker (226244) exited with code 143 (Terminated)
Oct 01 14:29:40 compute-0 neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7[226238]: [WARNING]  (226242) : All workers exited. Exiting... (0)
Oct 01 14:29:40 compute-0 systemd[1]: libpod-e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7.scope: Deactivated successfully.
Oct 01 14:29:40 compute-0 conmon[226238]: conmon e2622e4de2c6b6f060b8 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7.scope/container/memory.events
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.985 2 DEBUG nova.compute.manager [req-3a5287f5-71fc-447c-9f95-2a59734bb31a req-070e8b81-0a58-49f6-8eaa-141e3d5356fd 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Received event network-vif-unplugged-646ff317-9aed-4b52-80f0-ae16e4a76056 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.985 2 DEBUG oslo_concurrency.lockutils [req-3a5287f5-71fc-447c-9f95-2a59734bb31a req-070e8b81-0a58-49f6-8eaa-141e3d5356fd 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "247a480d-9e4d-4832-9646-d263b8a3035e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.985 2 DEBUG oslo_concurrency.lockutils [req-3a5287f5-71fc-447c-9f95-2a59734bb31a req-070e8b81-0a58-49f6-8eaa-141e3d5356fd 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "247a480d-9e4d-4832-9646-d263b8a3035e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.986 2 DEBUG oslo_concurrency.lockutils [req-3a5287f5-71fc-447c-9f95-2a59734bb31a req-070e8b81-0a58-49f6-8eaa-141e3d5356fd 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "247a480d-9e4d-4832-9646-d263b8a3035e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.986 2 DEBUG nova.compute.manager [req-3a5287f5-71fc-447c-9f95-2a59734bb31a req-070e8b81-0a58-49f6-8eaa-141e3d5356fd 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] No waiting events found dispatching network-vif-unplugged-646ff317-9aed-4b52-80f0-ae16e4a76056 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:29:40 compute-0 nova_compute[192698]: 2025-10-01 14:29:40.986 2 DEBUG nova.compute.manager [req-3a5287f5-71fc-447c-9f95-2a59734bb31a req-070e8b81-0a58-49f6-8eaa-141e3d5356fd 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Received event network-vif-unplugged-646ff317-9aed-4b52-80f0-ae16e4a76056 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:29:40 compute-0 podman[226650]: 2025-10-01 14:29:40.998255458 +0000 UTC m=+0.026607457 container died e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct 01 14:29:41 compute-0 sshd-session[226609]: Failed password for root from 193.46.255.99 port 40682 ssh2
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.021 2 INFO nova.virt.libvirt.driver [-] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Instance destroyed successfully.
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.021 2 DEBUG nova.objects.instance [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lazy-loading 'resources' on Instance uuid 247a480d-9e4d-4832-9646-d263b8a3035e obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:29:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7-userdata-shm.mount: Deactivated successfully.
Oct 01 14:29:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-b0cc31d06f1f31d31eb9057c73e160bf957b728979155ad525ce92c047b7c24b-merged.mount: Deactivated successfully.
Oct 01 14:29:41 compute-0 podman[226650]: 2025-10-01 14:29:41.054636075 +0000 UTC m=+0.082988034 container cleanup e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Oct 01 14:29:41 compute-0 systemd[1]: libpod-conmon-e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7.scope: Deactivated successfully.
Oct 01 14:29:41 compute-0 podman[226654]: 2025-10-01 14:29:41.106442648 +0000 UTC m=+0.116845274 container remove e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930)
Oct 01 14:29:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:41.115 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[8054d713-31f1-4f34-bd4f-5a2d06c2ddcc]: (4, ("Wed Oct  1 02:29:40 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 (e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7)\ne2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7\nWed Oct  1 02:29:40 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 (e2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7)\ne2622e4de2c6b6f060b8cd000e7eac4774f131de039b8b6ad1cd9dfec54fdbe7\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:41.117 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[b6cc8bba-e83f-462e-a918-f4d34cf80cd4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:41.118 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/031a8987-8430-4fb6-a464-01e4dca2fae7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:29:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:41.118 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[027905f3-8068-4e33-adf7-8da0f17edbc6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:41.119 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap031a8987-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:41 compute-0 kernel: tap031a8987-80: left promiscuous mode
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:41.153 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[fdacd7e3-8e1a-4ee2-a14f-fe23f9a1b1f2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:41.180 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[bd828ad3-df28-4513-b0ee-2281008fb8c5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:41.182 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c1d7f8e1-7114-4e67-8920-7a8d343212ee]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:41.199 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[07d71314-2d6a-4a9d-9860-f2fe37b4c1a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 528947, 'reachable_time': 39591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226699, 'error': None, 'target': 'ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:41 compute-0 systemd[1]: run-netns-ovnmeta\x2d031a8987\x2d8430\x2d4fb6\x2da464\x2d01e4dca2fae7.mount: Deactivated successfully.
Oct 01 14:29:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:41.202 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-031a8987-8430-4fb6-a464-01e4dca2fae7 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:29:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:29:41.203 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[8ae0f41b-a80c-4cb3-ac79-2b6a0f66f2bf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:29:41 compute-0 podman[226701]: 2025-10-01 14:29:41.307724863 +0000 UTC m=+0.070096976 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 01 14:29:41 compute-0 podman[226700]: 2025-10-01 14:29:41.31167311 +0000 UTC m=+0.073964111 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 14:29:41 compute-0 unix_chkpwd[226739]: password check failed for user (root)
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.529 2 DEBUG nova.virt.libvirt.vif [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-01T14:27:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1888530',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1888530',id=26,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:27:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d43115e3729442e1b68b749acc0dabc8',ramdisk_id='',reservation_id='r-rgo0d6jh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,manager,member',clean_attempts='1',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',i
mage_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-30131345',owner_user_name='tempest-TestExecuteStrategies-30131345-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:29:28Z,user_data=None,user_id='f8897741e6ca4770b56d28d05fa3fc42',uuid=247a480d-9e4d-4832-9646-d263b8a3035e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "646ff317-9aed-4b52-80f0-ae16e4a76056", "address": "fa:16:3e:f1:ea:d0", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap646ff317-9a", "ovs_interfaceid": "646ff317-9aed-4b52-80f0-ae16e4a76056", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.530 2 DEBUG nova.network.os_vif_util [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converting VIF {"id": "646ff317-9aed-4b52-80f0-ae16e4a76056", "address": "fa:16:3e:f1:ea:d0", "network": {"id": "031a8987-8430-4fb6-a464-01e4dca2fae7", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1415110967-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5c9696bee230443aa9465a892b11ae6b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap646ff317-9a", "ovs_interfaceid": "646ff317-9aed-4b52-80f0-ae16e4a76056", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.532 2 DEBUG nova.network.os_vif_util [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f1:ea:d0,bridge_name='br-int',has_traffic_filtering=True,id=646ff317-9aed-4b52-80f0-ae16e4a76056,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap646ff317-9a') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.532 2 DEBUG os_vif [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f1:ea:d0,bridge_name='br-int',has_traffic_filtering=True,id=646ff317-9aed-4b52-80f0-ae16e4a76056,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap646ff317-9a') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.535 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap646ff317-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.541 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=e16d1862-63c3-4640-b8d5-cabd149320d8) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.546 2 INFO os_vif [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f1:ea:d0,bridge_name='br-int',has_traffic_filtering=True,id=646ff317-9aed-4b52-80f0-ae16e4a76056,network=Network(031a8987-8430-4fb6-a464-01e4dca2fae7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap646ff317-9a')
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.547 2 INFO nova.virt.libvirt.driver [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Deleting instance files /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e_del
Oct 01 14:29:41 compute-0 nova_compute[192698]: 2025-10-01 14:29:41.548 2 INFO nova.virt.libvirt.driver [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Deletion of /var/lib/nova/instances/247a480d-9e4d-4832-9646-d263b8a3035e_del complete
Oct 01 14:29:42 compute-0 nova_compute[192698]: 2025-10-01 14:29:42.066 2 INFO nova.compute.manager [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Took 1.31 seconds to destroy the instance on the hypervisor.
Oct 01 14:29:42 compute-0 nova_compute[192698]: 2025-10-01 14:29:42.067 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:29:42 compute-0 nova_compute[192698]: 2025-10-01 14:29:42.067 2 DEBUG nova.compute.manager [-] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:29:42 compute-0 nova_compute[192698]: 2025-10-01 14:29:42.068 2 DEBUG nova.network.neutron [-] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:29:42 compute-0 nova_compute[192698]: 2025-10-01 14:29:42.068 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:42 compute-0 nova_compute[192698]: 2025-10-01 14:29:42.549 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:29:43 compute-0 nova_compute[192698]: 2025-10-01 14:29:43.004 2 DEBUG nova.compute.manager [req-bf739420-2d46-4d14-928d-ed58c53a15bf req-d058e405-fe78-46e7-862f-b9f2408af610 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Received event network-vif-deleted-646ff317-9aed-4b52-80f0-ae16e4a76056 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:29:43 compute-0 nova_compute[192698]: 2025-10-01 14:29:43.005 2 INFO nova.compute.manager [req-bf739420-2d46-4d14-928d-ed58c53a15bf req-d058e405-fe78-46e7-862f-b9f2408af610 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Neutron deleted interface 646ff317-9aed-4b52-80f0-ae16e4a76056; detaching it from the instance and deleting it from the info cache
Oct 01 14:29:43 compute-0 nova_compute[192698]: 2025-10-01 14:29:43.006 2 DEBUG nova.network.neutron [req-bf739420-2d46-4d14-928d-ed58c53a15bf req-d058e405-fe78-46e7-862f-b9f2408af610 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:29:43 compute-0 nova_compute[192698]: 2025-10-01 14:29:43.056 2 DEBUG nova.compute.manager [req-99545596-08ae-4679-9db8-23694343af65 req-65dada1b-95aa-4116-98ac-d04c3afc6122 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Received event network-vif-unplugged-646ff317-9aed-4b52-80f0-ae16e4a76056 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:29:43 compute-0 nova_compute[192698]: 2025-10-01 14:29:43.057 2 DEBUG oslo_concurrency.lockutils [req-99545596-08ae-4679-9db8-23694343af65 req-65dada1b-95aa-4116-98ac-d04c3afc6122 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "247a480d-9e4d-4832-9646-d263b8a3035e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:43 compute-0 nova_compute[192698]: 2025-10-01 14:29:43.058 2 DEBUG oslo_concurrency.lockutils [req-99545596-08ae-4679-9db8-23694343af65 req-65dada1b-95aa-4116-98ac-d04c3afc6122 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "247a480d-9e4d-4832-9646-d263b8a3035e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:43 compute-0 nova_compute[192698]: 2025-10-01 14:29:43.058 2 DEBUG oslo_concurrency.lockutils [req-99545596-08ae-4679-9db8-23694343af65 req-65dada1b-95aa-4116-98ac-d04c3afc6122 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "247a480d-9e4d-4832-9646-d263b8a3035e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:43 compute-0 nova_compute[192698]: 2025-10-01 14:29:43.058 2 DEBUG nova.compute.manager [req-99545596-08ae-4679-9db8-23694343af65 req-65dada1b-95aa-4116-98ac-d04c3afc6122 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] No waiting events found dispatching network-vif-unplugged-646ff317-9aed-4b52-80f0-ae16e4a76056 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:29:43 compute-0 nova_compute[192698]: 2025-10-01 14:29:43.059 2 DEBUG nova.compute.manager [req-99545596-08ae-4679-9db8-23694343af65 req-65dada1b-95aa-4116-98ac-d04c3afc6122 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Received event network-vif-unplugged-646ff317-9aed-4b52-80f0-ae16e4a76056 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:29:43 compute-0 unix_chkpwd[226740]: password check failed for user (root)
Oct 01 14:29:43 compute-0 sshd-session[226612]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:29:43 compute-0 nova_compute[192698]: 2025-10-01 14:29:43.423 2 DEBUG nova.network.neutron [-] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:29:43 compute-0 nova_compute[192698]: 2025-10-01 14:29:43.513 2 DEBUG nova.compute.manager [req-bf739420-2d46-4d14-928d-ed58c53a15bf req-d058e405-fe78-46e7-862f-b9f2408af610 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Detach interface failed, port_id=646ff317-9aed-4b52-80f0-ae16e4a76056, reason: Instance 247a480d-9e4d-4832-9646-d263b8a3035e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:29:43 compute-0 sshd-session[226609]: Failed password for root from 193.46.255.99 port 40682 ssh2
Oct 01 14:29:43 compute-0 nova_compute[192698]: 2025-10-01 14:29:43.930 2 INFO nova.compute.manager [-] [instance: 247a480d-9e4d-4832-9646-d263b8a3035e] Took 1.86 seconds to deallocate network for instance.
Oct 01 14:29:44 compute-0 unix_chkpwd[226741]: password check failed for user (root)
Oct 01 14:29:44 compute-0 nova_compute[192698]: 2025-10-01 14:29:44.451 2 DEBUG oslo_concurrency.lockutils [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:29:44 compute-0 nova_compute[192698]: 2025-10-01 14:29:44.452 2 DEBUG oslo_concurrency.lockutils [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:29:44 compute-0 nova_compute[192698]: 2025-10-01 14:29:44.501 2 DEBUG nova.compute.provider_tree [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:29:44 compute-0 nova_compute[192698]: 2025-10-01 14:29:44.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:45 compute-0 nova_compute[192698]: 2025-10-01 14:29:45.065 2 DEBUG nova.scheduler.client.report [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:29:45 compute-0 nova_compute[192698]: 2025-10-01 14:29:45.577 2 DEBUG oslo_concurrency.lockutils [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.125s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:45 compute-0 nova_compute[192698]: 2025-10-01 14:29:45.605 2 INFO nova.scheduler.client.report [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Deleted allocations for instance 247a480d-9e4d-4832-9646-d263b8a3035e
Oct 01 14:29:45 compute-0 sshd-session[226612]: Failed password for root from 101.47.181.100 port 38650 ssh2
Oct 01 14:29:46 compute-0 sshd-session[226609]: Failed password for root from 193.46.255.99 port 40682 ssh2
Oct 01 14:29:46 compute-0 nova_compute[192698]: 2025-10-01 14:29:46.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:46 compute-0 nova_compute[192698]: 2025-10-01 14:29:46.689 2 DEBUG oslo_concurrency.lockutils [None req-1c8e9de9-fa22-4f1e-a6d9-afdcc82b7dbc f8897741e6ca4770b56d28d05fa3fc42 d43115e3729442e1b68b749acc0dabc8 - - default default] Lock "247a480d-9e4d-4832-9646-d263b8a3035e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.467s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:29:47 compute-0 sshd-session[226609]: Received disconnect from 193.46.255.99 port 40682:11:  [preauth]
Oct 01 14:29:47 compute-0 sshd-session[226609]: Disconnected from authenticating user root 193.46.255.99 port 40682 [preauth]
Oct 01 14:29:47 compute-0 sshd-session[226609]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 01 14:29:47 compute-0 unix_chkpwd[226744]: password check failed for user (root)
Oct 01 14:29:47 compute-0 sshd-session[226742]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 01 14:29:48 compute-0 podman[226745]: 2025-10-01 14:29:48.167939569 +0000 UTC m=+0.081020610 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:29:48 compute-0 sshd-session[226612]: Connection closed by authenticating user root 101.47.181.100 port 38650 [preauth]
Oct 01 14:29:49 compute-0 nova_compute[192698]: 2025-10-01 14:29:49.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:49 compute-0 sshd-session[226742]: Failed password for root from 193.46.255.99 port 36860 ssh2
Oct 01 14:29:50 compute-0 unix_chkpwd[226772]: password check failed for user (root)
Oct 01 14:29:51 compute-0 nova_compute[192698]: 2025-10-01 14:29:51.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:51 compute-0 unix_chkpwd[226773]: password check failed for user (root)
Oct 01 14:29:51 compute-0 sshd-session[226770]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:29:52 compute-0 sshd-session[226742]: Failed password for root from 193.46.255.99 port 36860 ssh2
Oct 01 14:29:53 compute-0 sshd-session[226770]: Failed password for root from 101.47.181.100 port 40664 ssh2
Oct 01 14:29:53 compute-0 unix_chkpwd[226774]: password check failed for user (root)
Oct 01 14:29:54 compute-0 nova_compute[192698]: 2025-10-01 14:29:54.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:55 compute-0 nova_compute[192698]: 2025-10-01 14:29:55.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:55 compute-0 sshd-session[226742]: Failed password for root from 193.46.255.99 port 36860 ssh2
Oct 01 14:29:56 compute-0 sshd-session[226742]: Received disconnect from 193.46.255.99 port 36860:11:  [preauth]
Oct 01 14:29:56 compute-0 sshd-session[226742]: Disconnected from authenticating user root 193.46.255.99 port 36860 [preauth]
Oct 01 14:29:56 compute-0 sshd-session[226742]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.99  user=root
Oct 01 14:29:56 compute-0 nova_compute[192698]: 2025-10-01 14:29:56.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:29:59 compute-0 podman[226777]: 2025-10-01 14:29:59.182694815 +0000 UTC m=+0.091167244 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 01 14:29:59 compute-0 podman[226778]: 2025-10-01 14:29:59.248419133 +0000 UTC m=+0.152052072 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 01 14:29:59 compute-0 podman[203144]: time="2025-10-01T14:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:29:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:29:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3030 "" "Go-http-client/1.1"
Oct 01 14:29:59 compute-0 nova_compute[192698]: 2025-10-01 14:29:59.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:01 compute-0 openstack_network_exporter[205307]: ERROR   14:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:30:01 compute-0 openstack_network_exporter[205307]: ERROR   14:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:30:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:30:01 compute-0 openstack_network_exporter[205307]: ERROR   14:30:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:30:01 compute-0 openstack_network_exporter[205307]: ERROR   14:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:30:01 compute-0 openstack_network_exporter[205307]: ERROR   14:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:30:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:30:01 compute-0 nova_compute[192698]: 2025-10-01 14:30:01.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:02 compute-0 unix_chkpwd[226822]: password check failed for user (root)
Oct 01 14:30:02 compute-0 sshd-session[226775]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:30:03 compute-0 sshd-session[226770]: Connection closed by authenticating user root 101.47.181.100 port 40664 [preauth]
Oct 01 14:30:04 compute-0 sshd-session[226775]: Failed password for root from 101.47.181.100 port 40678 ssh2
Oct 01 14:30:04 compute-0 nova_compute[192698]: 2025-10-01 14:30:04.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:05 compute-0 sshd-session[226775]: Connection closed by authenticating user root 101.47.181.100 port 40678 [preauth]
Oct 01 14:30:06 compute-0 podman[226825]: 2025-10-01 14:30:06.194975833 +0000 UTC m=+0.102327864 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm)
Oct 01 14:30:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:06.437 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2f:b7:30 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac2316c2-cb81-4558-9b5e-4a4794313854', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e30e83299c1e445dbba9473590367e5b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9cd0b76-6dc9-458b-82d5-27e9ccc0503c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3deccf94-530c-46ad-826f-fffb32b268e2) old=Port_Binding(mac=['fa:16:3e:2f:b7:30'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac2316c2-cb81-4558-9b5e-4a4794313854', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e30e83299c1e445dbba9473590367e5b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:30:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:06.438 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3deccf94-530c-46ad-826f-fffb32b268e2 in datapath ac2316c2-cb81-4558-9b5e-4a4794313854 updated
Oct 01 14:30:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:06.439 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ac2316c2-cb81-4558-9b5e-4a4794313854, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:30:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:06.441 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[624e1f1b-530a-40d8-ad9b-45b49e71f4fb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:30:06 compute-0 nova_compute[192698]: 2025-10-01 14:30:06.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:09 compute-0 nova_compute[192698]: 2025-10-01 14:30:09.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:11 compute-0 nova_compute[192698]: 2025-10-01 14:30:11.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:12 compute-0 podman[226848]: 2025-10-01 14:30:12.176170323 +0000 UTC m=+0.085040339 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, io.buildah.version=1.41.4)
Oct 01 14:30:12 compute-0 podman[226849]: 2025-10-01 14:30:12.20540442 +0000 UTC m=+0.109665502 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct 01 14:30:13 compute-0 sshd-session[226488]: ssh_dispatch_run_fatal: Connection from authenticating user root 101.47.181.100 port 47362: Connection timed out [preauth]
Oct 01 14:30:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:14.295 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:30:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:14.295 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:30:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:14.295 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:30:14 compute-0 nova_compute[192698]: 2025-10-01 14:30:14.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:15.371 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3a:52:21 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4fcb4077-f89c-4dd0-9234-7cecfcb0f68b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4fcb4077-f89c-4dd0-9234-7cecfcb0f68b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80bb651087894631addd91dd6ce2ecd0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5534988e-1fc2-4c5d-a6a7-859fa76c641f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=195bf3d1-58e1-4b6e-a5d6-d25dfea77e01) old=Port_Binding(mac=['fa:16:3e:3a:52:21'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4fcb4077-f89c-4dd0-9234-7cecfcb0f68b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4fcb4077-f89c-4dd0-9234-7cecfcb0f68b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80bb651087894631addd91dd6ce2ecd0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:30:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:15.372 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 195bf3d1-58e1-4b6e-a5d6-d25dfea77e01 in datapath 4fcb4077-f89c-4dd0-9234-7cecfcb0f68b updated
Oct 01 14:30:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:15.373 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4fcb4077-f89c-4dd0-9234-7cecfcb0f68b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:30:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:15.374 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[785150c4-ef66-47de-96ad-a58cfd4a7f71]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:30:16 compute-0 nova_compute[192698]: 2025-10-01 14:30:16.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:17 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:17.729 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:30:17 compute-0 nova_compute[192698]: 2025-10-01 14:30:17.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:17 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:17.731 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:30:17 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:17.734 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:30:18 compute-0 unix_chkpwd[226888]: password check failed for user (root)
Oct 01 14:30:18 compute-0 sshd-session[226823]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:30:19 compute-0 podman[226889]: 2025-10-01 14:30:19.166362177 +0000 UTC m=+0.085387598 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:30:19 compute-0 nova_compute[192698]: 2025-10-01 14:30:19.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:21 compute-0 sshd-session[226823]: Failed password for root from 101.47.181.100 port 45252 ssh2
Oct 01 14:30:21 compute-0 sshd-session[226823]: Connection closed by authenticating user root 101.47.181.100 port 45252 [preauth]
Oct 01 14:30:21 compute-0 nova_compute[192698]: 2025-10-01 14:30:21.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:23 compute-0 unix_chkpwd[226915]: password check failed for user (root)
Oct 01 14:30:23 compute-0 sshd-session[226913]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:30:24 compute-0 nova_compute[192698]: 2025-10-01 14:30:24.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:25 compute-0 sshd-session[226913]: Failed password for root from 101.47.181.100 port 52156 ssh2
Oct 01 14:30:25 compute-0 sshd-session[226913]: Connection closed by authenticating user root 101.47.181.100 port 52156 [preauth]
Oct 01 14:30:26 compute-0 nova_compute[192698]: 2025-10-01 14:30:26.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:27 compute-0 nova_compute[192698]: 2025-10-01 14:30:27.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:30:28 compute-0 unix_chkpwd[226918]: password check failed for user (root)
Oct 01 14:30:28 compute-0 sshd-session[226916]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:30:29 compute-0 ovn_controller[94909]: 2025-10-01T14:30:29Z|00229|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct 01 14:30:29 compute-0 podman[203144]: time="2025-10-01T14:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:30:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:30:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3023 "" "Go-http-client/1.1"
Oct 01 14:30:29 compute-0 nova_compute[192698]: 2025-10-01 14:30:29.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:29 compute-0 nova_compute[192698]: 2025-10-01 14:30:29.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:30:29 compute-0 nova_compute[192698]: 2025-10-01 14:30:29.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:30:30 compute-0 podman[226919]: 2025-10-01 14:30:30.173000385 +0000 UTC m=+0.082731126 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 01 14:30:30 compute-0 podman[226920]: 2025-10-01 14:30:30.202595642 +0000 UTC m=+0.107069472 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible)
Oct 01 14:30:30 compute-0 nova_compute[192698]: 2025-10-01 14:30:30.458 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:30:30 compute-0 nova_compute[192698]: 2025-10-01 14:30:30.459 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:30:30 compute-0 nova_compute[192698]: 2025-10-01 14:30:30.459 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:30:30 compute-0 nova_compute[192698]: 2025-10-01 14:30:30.460 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:30:30 compute-0 nova_compute[192698]: 2025-10-01 14:30:30.686 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:30:30 compute-0 nova_compute[192698]: 2025-10-01 14:30:30.688 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:30:30 compute-0 nova_compute[192698]: 2025-10-01 14:30:30.718 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:30:30 compute-0 nova_compute[192698]: 2025-10-01 14:30:30.719 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5854MB free_disk=73.30251693725586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:30:30 compute-0 nova_compute[192698]: 2025-10-01 14:30:30.720 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:30:30 compute-0 nova_compute[192698]: 2025-10-01 14:30:30.721 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:30:30 compute-0 sshd-session[226916]: Failed password for root from 101.47.181.100 port 52162 ssh2
Oct 01 14:30:31 compute-0 openstack_network_exporter[205307]: ERROR   14:30:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:30:31 compute-0 openstack_network_exporter[205307]: ERROR   14:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:30:31 compute-0 openstack_network_exporter[205307]: ERROR   14:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:30:31 compute-0 openstack_network_exporter[205307]: ERROR   14:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:30:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:30:31 compute-0 openstack_network_exporter[205307]: ERROR   14:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:30:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:30:31 compute-0 nova_compute[192698]: 2025-10-01 14:30:31.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:31 compute-0 nova_compute[192698]: 2025-10-01 14:30:31.772 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:30:31 compute-0 nova_compute[192698]: 2025-10-01 14:30:31.773 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:30:30 up  1:29,  0 user,  load average: 0.26, 0.19, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:30:31 compute-0 nova_compute[192698]: 2025-10-01 14:30:31.797 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:30:31 compute-0 sshd-session[226916]: Connection closed by authenticating user root 101.47.181.100 port 52162 [preauth]
Oct 01 14:30:32 compute-0 nova_compute[192698]: 2025-10-01 14:30:32.305 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:30:32 compute-0 nova_compute[192698]: 2025-10-01 14:30:32.847 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:30:32 compute-0 nova_compute[192698]: 2025-10-01 14:30:32.847 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.127s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:30:33 compute-0 nova_compute[192698]: 2025-10-01 14:30:33.848 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:30:33 compute-0 nova_compute[192698]: 2025-10-01 14:30:33.849 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:30:34 compute-0 nova_compute[192698]: 2025-10-01 14:30:34.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:36 compute-0 nova_compute[192698]: 2025-10-01 14:30:36.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:36 compute-0 unix_chkpwd[226967]: password check failed for user (root)
Oct 01 14:30:36 compute-0 sshd-session[226965]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:30:37 compute-0 podman[226968]: 2025-10-01 14:30:37.161674198 +0000 UTC m=+0.071570996 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Oct 01 14:30:37 compute-0 nova_compute[192698]: 2025-10-01 14:30:37.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:30:38 compute-0 sshd-session[226965]: Failed password for root from 101.47.181.100 port 54442 ssh2
Oct 01 14:30:38 compute-0 nova_compute[192698]: 2025-10-01 14:30:38.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:30:38 compute-0 nova_compute[192698]: 2025-10-01 14:30:38.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:30:38 compute-0 nova_compute[192698]: 2025-10-01 14:30:38.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:30:39 compute-0 sshd-session[226965]: Connection closed by authenticating user root 101.47.181.100 port 54442 [preauth]
Oct 01 14:30:39 compute-0 nova_compute[192698]: 2025-10-01 14:30:39.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:41 compute-0 nova_compute[192698]: 2025-10-01 14:30:41.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:42 compute-0 unix_chkpwd[226992]: password check failed for user (root)
Oct 01 14:30:42 compute-0 sshd-session[226989]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:30:43 compute-0 podman[226993]: 2025-10-01 14:30:43.167934774 +0000 UTC m=+0.078033261 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_id=iscsid, org.label-schema.license=GPLv2)
Oct 01 14:30:43 compute-0 podman[226994]: 2025-10-01 14:30:43.183189934 +0000 UTC m=+0.089112148 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 01 14:30:44 compute-0 nova_compute[192698]: 2025-10-01 14:30:44.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:45 compute-0 sshd-session[226989]: Failed password for root from 101.47.181.100 port 37972 ssh2
Oct 01 14:30:45 compute-0 sshd-session[226989]: Connection closed by authenticating user root 101.47.181.100 port 37972 [preauth]
Oct 01 14:30:46 compute-0 nova_compute[192698]: 2025-10-01 14:30:46.096 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquiring lock "3665e2b0-b313-4242-af04-45597829e681" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:30:46 compute-0 nova_compute[192698]: 2025-10-01 14:30:46.097 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:30:46 compute-0 nova_compute[192698]: 2025-10-01 14:30:46.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:46 compute-0 nova_compute[192698]: 2025-10-01 14:30:46.604 2 DEBUG nova.compute.manager [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 01 14:30:47 compute-0 nova_compute[192698]: 2025-10-01 14:30:47.168 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:30:47 compute-0 nova_compute[192698]: 2025-10-01 14:30:47.168 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:30:47 compute-0 nova_compute[192698]: 2025-10-01 14:30:47.175 2 DEBUG nova.virt.hardware [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:30:47 compute-0 nova_compute[192698]: 2025-10-01 14:30:47.176 2 INFO nova.compute.claims [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:30:48 compute-0 unix_chkpwd[227033]: password check failed for user (root)
Oct 01 14:30:48 compute-0 sshd-session[227031]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:30:48 compute-0 nova_compute[192698]: 2025-10-01 14:30:48.237 2 DEBUG nova.compute.provider_tree [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:30:48 compute-0 nova_compute[192698]: 2025-10-01 14:30:48.747 2 DEBUG nova.scheduler.client.report [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:30:49 compute-0 nova_compute[192698]: 2025-10-01 14:30:49.256 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.088s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:30:49 compute-0 nova_compute[192698]: 2025-10-01 14:30:49.257 2 DEBUG nova.compute.manager [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 01 14:30:49 compute-0 sshd-session[227031]: Failed password for root from 101.47.181.100 port 37980 ssh2
Oct 01 14:30:49 compute-0 nova_compute[192698]: 2025-10-01 14:30:49.773 2 DEBUG nova.compute.manager [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 01 14:30:49 compute-0 nova_compute[192698]: 2025-10-01 14:30:49.774 2 DEBUG nova.network.neutron [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 01 14:30:49 compute-0 nova_compute[192698]: 2025-10-01 14:30:49.774 2 WARNING neutronclient.v2_0.client [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:30:49 compute-0 nova_compute[192698]: 2025-10-01 14:30:49.775 2 WARNING neutronclient.v2_0.client [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:30:49 compute-0 nova_compute[192698]: 2025-10-01 14:30:49.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:50 compute-0 podman[227034]: 2025-10-01 14:30:50.193843729 +0000 UTC m=+0.103284410 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:30:50 compute-0 nova_compute[192698]: 2025-10-01 14:30:50.287 2 INFO nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 01 14:30:50 compute-0 nova_compute[192698]: 2025-10-01 14:30:50.801 2 DEBUG nova.compute.manager [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 01 14:30:51 compute-0 sshd-session[227031]: Connection closed by authenticating user root 101.47.181.100 port 37980 [preauth]
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.713 2 DEBUG nova.network.neutron [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Successfully created port: f853ffac-a897-4f2b-9131-4b4cc7ffdb18 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.819 2 DEBUG nova.compute.manager [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.820 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.821 2 INFO nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Creating image(s)
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.821 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquiring lock "/var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.822 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "/var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.822 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "/var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.823 2 DEBUG oslo_utils.imageutils.format_inspector [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.827 2 DEBUG oslo_utils.imageutils.format_inspector [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.828 2 DEBUG oslo_concurrency.processutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.913 2 DEBUG oslo_concurrency.processutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.914 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.915 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.916 2 DEBUG oslo_utils.imageutils.format_inspector [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.919 2 DEBUG oslo_utils.imageutils.format_inspector [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.920 2 DEBUG oslo_concurrency.processutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.972 2 DEBUG oslo_concurrency.processutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:30:51 compute-0 nova_compute[192698]: 2025-10-01 14:30:51.973 2 DEBUG oslo_concurrency.processutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.022 2 DEBUG oslo_concurrency.processutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.023 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.023 2 DEBUG oslo_concurrency.processutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.074 2 DEBUG oslo_concurrency.processutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.075 2 DEBUG nova.virt.disk.api [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Checking if we can resize image /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.076 2 DEBUG oslo_concurrency.processutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.134 2 DEBUG oslo_concurrency.processutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.135 2 DEBUG nova.virt.disk.api [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Cannot resize image /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.136 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.136 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Ensure instance console log exists: /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.137 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.137 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.138 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.390 2 DEBUG nova.network.neutron [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Successfully updated port: f853ffac-a897-4f2b-9131-4b4cc7ffdb18 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.476 2 DEBUG nova.compute.manager [req-c8a29f01-cfe6-4853-8679-950ffa36ba91 req-a895b530-247d-4105-b050-3fa5e1fcfda7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Received event network-changed-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.477 2 DEBUG nova.compute.manager [req-c8a29f01-cfe6-4853-8679-950ffa36ba91 req-a895b530-247d-4105-b050-3fa5e1fcfda7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Refreshing instance network info cache due to event network-changed-f853ffac-a897-4f2b-9131-4b4cc7ffdb18. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.477 2 DEBUG oslo_concurrency.lockutils [req-c8a29f01-cfe6-4853-8679-950ffa36ba91 req-a895b530-247d-4105-b050-3fa5e1fcfda7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-3665e2b0-b313-4242-af04-45597829e681" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.478 2 DEBUG oslo_concurrency.lockutils [req-c8a29f01-cfe6-4853-8679-950ffa36ba91 req-a895b530-247d-4105-b050-3fa5e1fcfda7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-3665e2b0-b313-4242-af04-45597829e681" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.478 2 DEBUG nova.network.neutron [req-c8a29f01-cfe6-4853-8679-950ffa36ba91 req-a895b530-247d-4105-b050-3fa5e1fcfda7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Refreshing network info cache for port f853ffac-a897-4f2b-9131-4b4cc7ffdb18 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.900 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquiring lock "refresh_cache-3665e2b0-b313-4242-af04-45597829e681" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:30:52 compute-0 nova_compute[192698]: 2025-10-01 14:30:52.986 2 WARNING neutronclient.v2_0.client [req-c8a29f01-cfe6-4853-8679-950ffa36ba91 req-a895b530-247d-4105-b050-3fa5e1fcfda7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:30:53 compute-0 nova_compute[192698]: 2025-10-01 14:30:53.569 2 DEBUG nova.network.neutron [req-c8a29f01-cfe6-4853-8679-950ffa36ba91 req-a895b530-247d-4105-b050-3fa5e1fcfda7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:30:53 compute-0 nova_compute[192698]: 2025-10-01 14:30:53.745 2 DEBUG nova.network.neutron [req-c8a29f01-cfe6-4853-8679-950ffa36ba91 req-a895b530-247d-4105-b050-3fa5e1fcfda7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:30:54 compute-0 nova_compute[192698]: 2025-10-01 14:30:54.252 2 DEBUG oslo_concurrency.lockutils [req-c8a29f01-cfe6-4853-8679-950ffa36ba91 req-a895b530-247d-4105-b050-3fa5e1fcfda7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-3665e2b0-b313-4242-af04-45597829e681" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:30:54 compute-0 nova_compute[192698]: 2025-10-01 14:30:54.253 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquired lock "refresh_cache-3665e2b0-b313-4242-af04-45597829e681" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:30:54 compute-0 nova_compute[192698]: 2025-10-01 14:30:54.254 2 DEBUG nova.network.neutron [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:30:54 compute-0 nova_compute[192698]: 2025-10-01 14:30:54.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:55 compute-0 nova_compute[192698]: 2025-10-01 14:30:55.274 2 DEBUG nova.network.neutron [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:30:55 compute-0 nova_compute[192698]: 2025-10-01 14:30:55.488 2 WARNING neutronclient.v2_0.client [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:30:55 compute-0 nova_compute[192698]: 2025-10-01 14:30:55.659 2 DEBUG nova.network.neutron [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Updating instance_info_cache with network_info: [{"id": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "address": "fa:16:3e:8a:1c:76", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf853ffac-a8", "ovs_interfaceid": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.168 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Releasing lock "refresh_cache-3665e2b0-b313-4242-af04-45597829e681" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.169 2 DEBUG nova.compute.manager [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Instance network_info: |[{"id": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "address": "fa:16:3e:8a:1c:76", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf853ffac-a8", "ovs_interfaceid": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.173 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Start _get_guest_xml network_info=[{"id": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "address": "fa:16:3e:8a:1c:76", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf853ffac-a8", "ovs_interfaceid": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.180 2 WARNING nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.182 2 DEBUG nova.virt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-344748452', uuid='3665e2b0-b313-4242-af04-45597829e681'), owner=OwnerMeta(userid='b71a58b28129460f94de238eedc8965c', username='tempest-TestExecuteVmWorkloadBalanceStrategy-1927341926-project-admin', projectid='80bb651087894631addd91dd6ce2ecd0', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-1927341926'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "address": "fa:16:3e:8a:1c:76", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf853ffac-a8", "ovs_interfaceid": 
"f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759329056.1827242) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.190 2 DEBUG nova.virt.libvirt.host [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.191 2 DEBUG nova.virt.libvirt.host [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.194 2 DEBUG nova.virt.libvirt.host [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.195 2 DEBUG nova.virt.libvirt.host [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.196 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.196 2 DEBUG nova.virt.hardware [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.197 2 DEBUG nova.virt.hardware [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.198 2 DEBUG nova.virt.hardware [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.198 2 DEBUG nova.virt.hardware [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.198 2 DEBUG nova.virt.hardware [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.199 2 DEBUG nova.virt.hardware [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.199 2 DEBUG nova.virt.hardware [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.200 2 DEBUG nova.virt.hardware [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.200 2 DEBUG nova.virt.hardware [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.200 2 DEBUG nova.virt.hardware [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.201 2 DEBUG nova.virt.hardware [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.208 2 DEBUG nova.virt.libvirt.vif [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:30:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-344748452',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-344748452',id=29,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='80bb651087894631addd91dd6ce2ecd0',ramdisk_id='',reservation_id='r-ep8em9s8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1927341926',owner_user_name='tempest-Te
stExecuteVmWorkloadBalanceStrategy-1927341926-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:30:50Z,user_data=None,user_id='b71a58b28129460f94de238eedc8965c',uuid=3665e2b0-b313-4242-af04-45597829e681,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "address": "fa:16:3e:8a:1c:76", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf853ffac-a8", "ovs_interfaceid": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.208 2 DEBUG nova.network.os_vif_util [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Converting VIF {"id": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "address": "fa:16:3e:8a:1c:76", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf853ffac-a8", "ovs_interfaceid": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.210 2 DEBUG nova.network.os_vif_util [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:1c:76,bridge_name='br-int',has_traffic_filtering=True,id=f853ffac-a897-4f2b-9131-4b4cc7ffdb18,network=Network(ac2316c2-cb81-4558-9b5e-4a4794313854),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf853ffac-a8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.211 2 DEBUG nova.objects.instance [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3665e2b0-b313-4242-af04-45597829e681 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.732 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:30:56 compute-0 nova_compute[192698]:   <uuid>3665e2b0-b313-4242-af04-45597829e681</uuid>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   <name>instance-0000001d</name>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-344748452</nova:name>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:30:56</nova:creationTime>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:30:56 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:30:56 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:user uuid="b71a58b28129460f94de238eedc8965c">tempest-TestExecuteVmWorkloadBalanceStrategy-1927341926-project-admin</nova:user>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:project uuid="80bb651087894631addd91dd6ce2ecd0">tempest-TestExecuteVmWorkloadBalanceStrategy-1927341926</nova:project>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         <nova:port uuid="f853ffac-a897-4f2b-9131-4b4cc7ffdb18">
Oct 01 14:30:56 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <system>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <entry name="serial">3665e2b0-b313-4242-af04-45597829e681</entry>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <entry name="uuid">3665e2b0-b313-4242-af04-45597829e681</entry>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     </system>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   <os>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   </os>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   <features>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   </features>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk.config"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:8a:1c:76"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <target dev="tapf853ffac-a8"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/console.log" append="off"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <video>
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     </video>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:30:56 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:30:56 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:30:56 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:30:56 compute-0 nova_compute[192698]: </domain>
Oct 01 14:30:56 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.733 2 DEBUG nova.compute.manager [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Preparing to wait for external event network-vif-plugged-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.734 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquiring lock "3665e2b0-b313-4242-af04-45597829e681-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.734 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.734 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.735 2 DEBUG nova.virt.libvirt.vif [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:30:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-344748452',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-344748452',id=29,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='80bb651087894631addd91dd6ce2ecd0',ramdisk_id='',reservation_id='r-ep8em9s8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1927341926',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1927341926-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:30:50Z,user_data=None,user_id='b71a58b28129460f94de238eedc8965c',uuid=3665e2b0-b313-4242-af04-45597829e681,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "address": "fa:16:3e:8a:1c:76", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf853ffac-a8", "ovs_interfaceid": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.735 2 DEBUG nova.network.os_vif_util [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Converting VIF {"id": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "address": "fa:16:3e:8a:1c:76", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf853ffac-a8", "ovs_interfaceid": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.736 2 DEBUG nova.network.os_vif_util [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:1c:76,bridge_name='br-int',has_traffic_filtering=True,id=f853ffac-a897-4f2b-9131-4b4cc7ffdb18,network=Network(ac2316c2-cb81-4558-9b5e-4a4794313854),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf853ffac-a8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.736 2 DEBUG os_vif [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:1c:76,bridge_name='br-int',has_traffic_filtering=True,id=f853ffac-a897-4f2b-9131-4b4cc7ffdb18,network=Network(ac2316c2-cb81-4558-9b5e-4a4794313854),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf853ffac-a8') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.737 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.737 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.738 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '5bf02e4a-2e4f-5c0e-ba00-5251ea0def72', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.783 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf853ffac-a8, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.783 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf853ffac-a8, col_values=(('qos', UUID('3954863e-2cb7-4f97-bf36-21ae02aa1c21')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.784 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf853ffac-a8, col_values=(('external_ids', {'iface-id': 'f853ffac-a897-4f2b-9131-4b4cc7ffdb18', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:1c:76', 'vm-uuid': '3665e2b0-b313-4242-af04-45597829e681'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:56 compute-0 NetworkManager[51741]: <info>  [1759329056.7865] manager: (tapf853ffac-a8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:56 compute-0 nova_compute[192698]: 2025-10-01 14:30:56.794 2 INFO os_vif [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:1c:76,bridge_name='br-int',has_traffic_filtering=True,id=f853ffac-a897-4f2b-9131-4b4cc7ffdb18,network=Network(ac2316c2-cb81-4558-9b5e-4a4794313854),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf853ffac-a8')
Oct 01 14:30:58 compute-0 nova_compute[192698]: 2025-10-01 14:30:58.349 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:30:58 compute-0 nova_compute[192698]: 2025-10-01 14:30:58.350 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:30:58 compute-0 nova_compute[192698]: 2025-10-01 14:30:58.350 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] No VIF found with MAC fa:16:3e:8a:1c:76, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:30:58 compute-0 nova_compute[192698]: 2025-10-01 14:30:58.351 2 INFO nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Using config drive
Oct 01 14:30:58 compute-0 nova_compute[192698]: 2025-10-01 14:30:58.865 2 WARNING neutronclient.v2_0.client [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:30:59 compute-0 nova_compute[192698]: 2025-10-01 14:30:59.479 2 INFO nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Creating config drive at /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk.config
Oct 01 14:30:59 compute-0 nova_compute[192698]: 2025-10-01 14:30:59.487 2 DEBUG oslo_concurrency.processutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpndh1ww5e execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:30:59 compute-0 nova_compute[192698]: 2025-10-01 14:30:59.630 2 DEBUG oslo_concurrency.processutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpndh1ww5e" returned: 0 in 0.143s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:30:59 compute-0 kernel: tapf853ffac-a8: entered promiscuous mode
Oct 01 14:30:59 compute-0 NetworkManager[51741]: <info>  [1759329059.7192] manager: (tapf853ffac-a8): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Oct 01 14:30:59 compute-0 nova_compute[192698]: 2025-10-01 14:30:59.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:59 compute-0 ovn_controller[94909]: 2025-10-01T14:30:59Z|00230|binding|INFO|Claiming lport f853ffac-a897-4f2b-9131-4b4cc7ffdb18 for this chassis.
Oct 01 14:30:59 compute-0 ovn_controller[94909]: 2025-10-01T14:30:59Z|00231|binding|INFO|f853ffac-a897-4f2b-9131-4b4cc7ffdb18: Claiming fa:16:3e:8a:1c:76 10.100.0.12
Oct 01 14:30:59 compute-0 nova_compute[192698]: 2025-10-01 14:30:59.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:59 compute-0 nova_compute[192698]: 2025-10-01 14:30:59.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.737 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:1c:76 10.100.0.12'], port_security=['fa:16:3e:8a:1c:76 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3665e2b0-b313-4242-af04-45597829e681', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac2316c2-cb81-4558-9b5e-4a4794313854', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80bb651087894631addd91dd6ce2ecd0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '89f6228c-210a-405a-bc96-86ab494387a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9cd0b76-6dc9-458b-82d5-27e9ccc0503c, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=f853ffac-a897-4f2b-9131-4b4cc7ffdb18) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.739 103791 INFO neutron.agent.ovn.metadata.agent [-] Port f853ffac-a897-4f2b-9131-4b4cc7ffdb18 in datapath ac2316c2-cb81-4558-9b5e-4a4794313854 bound to our chassis
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.740 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac2316c2-cb81-4558-9b5e-4a4794313854
Oct 01 14:30:59 compute-0 podman[203144]: time="2025-10-01T14:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:30:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.760 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[382c29b9-ef1d-4451-8289-793398d661f1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.761 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapac2316c2-c1 in ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.765 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapac2316c2-c0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.765 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[2df36004-bf6a-4578-908f-7a1c5759b1f6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.766 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ef96fce5-bb51-4e43-90a5-e263f54dfa66]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:30:59 compute-0 systemd-udevd[227095]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:30:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3025 "" "Go-http-client/1.1"
Oct 01 14:30:59 compute-0 systemd-machined[152704]: New machine qemu-22-instance-0000001d.
Oct 01 14:30:59 compute-0 NetworkManager[51741]: <info>  [1759329059.7846] device (tapf853ffac-a8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:30:59 compute-0 NetworkManager[51741]: <info>  [1759329059.7854] device (tapf853ffac-a8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.789 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[12e0acc5-cbaf-447e-aabd-65bc316077b4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:30:59 compute-0 nova_compute[192698]: 2025-10-01 14:30:59.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:59 compute-0 ovn_controller[94909]: 2025-10-01T14:30:59Z|00232|binding|INFO|Setting lport f853ffac-a897-4f2b-9131-4b4cc7ffdb18 ovn-installed in OVS
Oct 01 14:30:59 compute-0 ovn_controller[94909]: 2025-10-01T14:30:59Z|00233|binding|INFO|Setting lport f853ffac-a897-4f2b-9131-4b4cc7ffdb18 up in Southbound
Oct 01 14:30:59 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000001d.
Oct 01 14:30:59 compute-0 nova_compute[192698]: 2025-10-01 14:30:59.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.806 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[6dd41a85-bfa1-4a24-96a6-ea4cc9ffa9df]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.852 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[4505eb9c-1737-4815-9b29-3b5c3c94e654]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:30:59 compute-0 NetworkManager[51741]: <info>  [1759329059.8620] manager: (tapac2316c2-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.861 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[451710e0-97df-4153-9061-9c056798a786]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.898 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[1b490ff2-71d2-4035-95e9-dc8835dc4c26]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.902 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[575864fe-a19a-4ab2-9c58-a55119928cfc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:30:59 compute-0 NetworkManager[51741]: <info>  [1759329059.9339] device (tapac2316c2-c0): carrier: link connected
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.942 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[056594ba-1781-4a49-9eb7-254ce2336313]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:30:59 compute-0 nova_compute[192698]: 2025-10-01 14:30:59.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.964 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[15f24355-0e30-410d-b523-82930966ef27]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac2316c2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:b7:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542046, 'reachable_time': 24570, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227128, 'error': None, 'target': 'ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:30:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:30:59.992 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a14997f4-8aad-4365-8a48-2d0a667a7f5e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:b730'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542046, 'tstamp': 542046}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227129, 'error': None, 'target': 'ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.001 2 DEBUG nova.compute.manager [req-c55469e1-8c9d-4a2c-8dea-649297b4736c req-6767e8ac-924e-419b-b799-08d8cc257f65 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Received event network-vif-plugged-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.001 2 DEBUG oslo_concurrency.lockutils [req-c55469e1-8c9d-4a2c-8dea-649297b4736c req-6767e8ac-924e-419b-b799-08d8cc257f65 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "3665e2b0-b313-4242-af04-45597829e681-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.001 2 DEBUG oslo_concurrency.lockutils [req-c55469e1-8c9d-4a2c-8dea-649297b4736c req-6767e8ac-924e-419b-b799-08d8cc257f65 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.001 2 DEBUG oslo_concurrency.lockutils [req-c55469e1-8c9d-4a2c-8dea-649297b4736c req-6767e8ac-924e-419b-b799-08d8cc257f65 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.001 2 DEBUG nova.compute.manager [req-c55469e1-8c9d-4a2c-8dea-649297b4736c req-6767e8ac-924e-419b-b799-08d8cc257f65 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Processing event network-vif-plugged-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.022 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3297b2-4a01-4675-844b-154a929369c1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac2316c2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:b7:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 306, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 306, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542046, 'reachable_time': 24570, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 264, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 264, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227130, 'error': None, 'target': 'ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.069 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3898a1-e81b-4299-b920-bd298e31f654]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.158 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[3feae4a4-55ef-4042-81d5-53abe5e9f1ca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.160 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac2316c2-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.160 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.161 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac2316c2-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:31:00 compute-0 NetworkManager[51741]: <info>  [1759329060.1643] manager: (tapac2316c2-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:00 compute-0 kernel: tapac2316c2-c0: entered promiscuous mode
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.168 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac2316c2-c0, col_values=(('external_ids', {'iface-id': '3deccf94-530c-46ad-826f-fffb32b268e2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:31:00 compute-0 ovn_controller[94909]: 2025-10-01T14:31:00Z|00234|binding|INFO|Releasing lport 3deccf94-530c-46ad-826f-fffb32b268e2 from this chassis (sb_readonly=0)
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.200 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[303c4710-2406-495a-bb49-9da37fbd13b1]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.201 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ac2316c2-cb81-4558-9b5e-4a4794313854.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ac2316c2-cb81-4558-9b5e-4a4794313854.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.201 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ac2316c2-cb81-4558-9b5e-4a4794313854.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ac2316c2-cb81-4558-9b5e-4a4794313854.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.201 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for ac2316c2-cb81-4558-9b5e-4a4794313854 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.202 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ac2316c2-cb81-4558-9b5e-4a4794313854.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ac2316c2-cb81-4558-9b5e-4a4794313854.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.202 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a0513e49-d9af-428f-9d9a-4cfc054545de]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.203 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ac2316c2-cb81-4558-9b5e-4a4794313854.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ac2316c2-cb81-4558-9b5e-4a4794313854.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.203 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c7461ce8-be53-4970-9df3-81ab254c6731]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.204 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-ac2316c2-cb81-4558-9b5e-4a4794313854
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/ac2316c2-cb81-4558-9b5e-4a4794313854.pid.haproxy
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID ac2316c2-cb81-4558-9b5e-4a4794313854
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:31:00 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:00.205 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854', 'env', 'PROCESS_TAG=haproxy-ac2316c2-cb81-4558-9b5e-4a4794313854', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ac2316c2-cb81-4558-9b5e-4a4794313854.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:31:00 compute-0 podman[227169]: 2025-10-01 14:31:00.668767501 +0000 UTC m=+0.063339265 container create 9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 01 14:31:00 compute-0 systemd[1]: Started libpod-conmon-9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3.scope.
Oct 01 14:31:00 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:31:00 compute-0 podman[227169]: 2025-10-01 14:31:00.631995492 +0000 UTC m=+0.026567486 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:31:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a1c0f6836b9e3c0c4f3a2f50b85afeef6ea5a990508bbe9d6549f1f75ae0914/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.734 2 DEBUG nova.compute.manager [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.737 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.741 2 INFO nova.virt.libvirt.driver [-] [instance: 3665e2b0-b313-4242-af04-45597829e681] Instance spawned successfully.
Oct 01 14:31:00 compute-0 nova_compute[192698]: 2025-10-01 14:31:00.741 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 01 14:31:00 compute-0 podman[227169]: 2025-10-01 14:31:00.743531853 +0000 UTC m=+0.138103657 container init 9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Oct 01 14:31:00 compute-0 podman[227169]: 2025-10-01 14:31:00.752673059 +0000 UTC m=+0.147244833 container start 9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4)
Oct 01 14:31:00 compute-0 neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854[227187]: [NOTICE]   (227217) : New worker (227229) forked
Oct 01 14:31:00 compute-0 neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854[227187]: [NOTICE]   (227217) : Loading success.
Oct 01 14:31:00 compute-0 podman[227183]: 2025-10-01 14:31:00.792171921 +0000 UTC m=+0.085587003 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, 
org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 01 14:31:00 compute-0 podman[227186]: 2025-10-01 14:31:00.851944189 +0000 UTC m=+0.138137477 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Oct 01 14:31:01 compute-0 nova_compute[192698]: 2025-10-01 14:31:01.265 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:31:01 compute-0 nova_compute[192698]: 2025-10-01 14:31:01.266 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:31:01 compute-0 nova_compute[192698]: 2025-10-01 14:31:01.267 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:31:01 compute-0 nova_compute[192698]: 2025-10-01 14:31:01.267 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:31:01 compute-0 nova_compute[192698]: 2025-10-01 14:31:01.268 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:31:01 compute-0 nova_compute[192698]: 2025-10-01 14:31:01.269 2 DEBUG nova.virt.libvirt.driver [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:31:01 compute-0 openstack_network_exporter[205307]: ERROR   14:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:31:01 compute-0 openstack_network_exporter[205307]: ERROR   14:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:31:01 compute-0 openstack_network_exporter[205307]: ERROR   14:31:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:31:01 compute-0 openstack_network_exporter[205307]: ERROR   14:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:31:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:31:01 compute-0 openstack_network_exporter[205307]: ERROR   14:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:31:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:31:01 compute-0 nova_compute[192698]: 2025-10-01 14:31:01.779 2 INFO nova.compute.manager [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Took 9.96 seconds to spawn the instance on the hypervisor.
Oct 01 14:31:01 compute-0 nova_compute[192698]: 2025-10-01 14:31:01.779 2 DEBUG nova.compute.manager [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:31:01 compute-0 nova_compute[192698]: 2025-10-01 14:31:01.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:02 compute-0 nova_compute[192698]: 2025-10-01 14:31:02.082 2 DEBUG nova.compute.manager [req-b05f6c25-bcfe-4b89-9b2e-126e1b62a48c req-93a68e4d-f99e-4d98-b8a1-c7b3ae0ab750 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Received event network-vif-plugged-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:31:02 compute-0 nova_compute[192698]: 2025-10-01 14:31:02.082 2 DEBUG oslo_concurrency.lockutils [req-b05f6c25-bcfe-4b89-9b2e-126e1b62a48c req-93a68e4d-f99e-4d98-b8a1-c7b3ae0ab750 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "3665e2b0-b313-4242-af04-45597829e681-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:31:02 compute-0 nova_compute[192698]: 2025-10-01 14:31:02.083 2 DEBUG oslo_concurrency.lockutils [req-b05f6c25-bcfe-4b89-9b2e-126e1b62a48c req-93a68e4d-f99e-4d98-b8a1-c7b3ae0ab750 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:31:02 compute-0 nova_compute[192698]: 2025-10-01 14:31:02.083 2 DEBUG oslo_concurrency.lockutils [req-b05f6c25-bcfe-4b89-9b2e-126e1b62a48c req-93a68e4d-f99e-4d98-b8a1-c7b3ae0ab750 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:31:02 compute-0 nova_compute[192698]: 2025-10-01 14:31:02.083 2 DEBUG nova.compute.manager [req-b05f6c25-bcfe-4b89-9b2e-126e1b62a48c req-93a68e4d-f99e-4d98-b8a1-c7b3ae0ab750 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] No waiting events found dispatching network-vif-plugged-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:31:02 compute-0 nova_compute[192698]: 2025-10-01 14:31:02.084 2 WARNING nova.compute.manager [req-b05f6c25-bcfe-4b89-9b2e-126e1b62a48c req-93a68e4d-f99e-4d98-b8a1-c7b3ae0ab750 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Received unexpected event network-vif-plugged-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 for instance with vm_state active and task_state None.
Oct 01 14:31:02 compute-0 nova_compute[192698]: 2025-10-01 14:31:02.322 2 INFO nova.compute.manager [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Took 15.20 seconds to build instance.
Oct 01 14:31:02 compute-0 nova_compute[192698]: 2025-10-01 14:31:02.828 2 DEBUG oslo_concurrency.lockutils [None req-524373c7-bc0e-46f9-973c-59bd779228ce b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.731s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:31:04 compute-0 nova_compute[192698]: 2025-10-01 14:31:04.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:06 compute-0 nova_compute[192698]: 2025-10-01 14:31:06.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:08 compute-0 podman[227245]: 2025-10-01 14:31:08.167340363 +0000 UTC m=+0.072526233 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal)
Oct 01 14:31:09 compute-0 nova_compute[192698]: 2025-10-01 14:31:09.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:11 compute-0 nova_compute[192698]: 2025-10-01 14:31:11.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:13 compute-0 ovn_controller[94909]: 2025-10-01T14:31:13Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8a:1c:76 10.100.0.12
Oct 01 14:31:13 compute-0 ovn_controller[94909]: 2025-10-01T14:31:13Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8a:1c:76 10.100.0.12
Oct 01 14:31:14 compute-0 podman[227283]: 2025-10-01 14:31:14.169489405 +0000 UTC m=+0.076129619 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true)
Oct 01 14:31:14 compute-0 podman[227282]: 2025-10-01 14:31:14.181229411 +0000 UTC m=+0.084198466 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:31:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:14.296 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:31:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:14.297 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:31:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:14.297 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:31:14 compute-0 nova_compute[192698]: 2025-10-01 14:31:14.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:16 compute-0 nova_compute[192698]: 2025-10-01 14:31:16.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:19 compute-0 nova_compute[192698]: 2025-10-01 14:31:19.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:20 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 01 14:31:20 compute-0 podman[227324]: 2025-10-01 14:31:20.328592151 +0000 UTC m=+0.058661789 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:31:21 compute-0 nova_compute[192698]: 2025-10-01 14:31:21.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:24 compute-0 nova_compute[192698]: 2025-10-01 14:31:24.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:26 compute-0 nova_compute[192698]: 2025-10-01 14:31:26.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:29 compute-0 podman[203144]: time="2025-10-01T14:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:31:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:31:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3497 "" "Go-http-client/1.1"
Oct 01 14:31:29 compute-0 nova_compute[192698]: 2025-10-01 14:31:29.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:31:29 compute-0 nova_compute[192698]: 2025-10-01 14:31:29.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:31:29 compute-0 nova_compute[192698]: 2025-10-01 14:31:29.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:31:29 compute-0 nova_compute[192698]: 2025-10-01 14:31:29.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:30 compute-0 ovn_controller[94909]: 2025-10-01T14:31:30Z|00235|memory_trim|INFO|Detected inactivity (last active 30018 ms ago): trimming memory
Oct 01 14:31:30 compute-0 nova_compute[192698]: 2025-10-01 14:31:30.441 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:31:30 compute-0 nova_compute[192698]: 2025-10-01 14:31:30.442 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:31:30 compute-0 nova_compute[192698]: 2025-10-01 14:31:30.443 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:31:30 compute-0 nova_compute[192698]: 2025-10-01 14:31:30.443 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:31:31 compute-0 podman[227349]: 2025-10-01 14:31:31.177105415 +0000 UTC m=+0.089693034 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, 
org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct 01 14:31:31 compute-0 podman[227350]: 2025-10-01 14:31:31.235520197 +0000 UTC m=+0.143836161 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4)
Oct 01 14:31:31 compute-0 openstack_network_exporter[205307]: ERROR   14:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:31:31 compute-0 openstack_network_exporter[205307]: ERROR   14:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:31:31 compute-0 openstack_network_exporter[205307]: ERROR   14:31:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:31:31 compute-0 openstack_network_exporter[205307]: ERROR   14:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:31:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:31:31 compute-0 openstack_network_exporter[205307]: ERROR   14:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:31:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:31:31 compute-0 nova_compute[192698]: 2025-10-01 14:31:31.499 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:31:31 compute-0 unix_chkpwd[227396]: password check failed for user (root)
Oct 01 14:31:31 compute-0 sshd-session[227058]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:31:31 compute-0 nova_compute[192698]: 2025-10-01 14:31:31.562 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:31:31 compute-0 nova_compute[192698]: 2025-10-01 14:31:31.564 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:31:31 compute-0 nova_compute[192698]: 2025-10-01 14:31:31.659 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:31:31 compute-0 nova_compute[192698]: 2025-10-01 14:31:31.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:31 compute-0 nova_compute[192698]: 2025-10-01 14:31:31.891 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:31:31 compute-0 nova_compute[192698]: 2025-10-01 14:31:31.893 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:31:31 compute-0 nova_compute[192698]: 2025-10-01 14:31:31.924 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:31:31 compute-0 nova_compute[192698]: 2025-10-01 14:31:31.925 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5677MB free_disk=73.27368545532227GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:31:31 compute-0 nova_compute[192698]: 2025-10-01 14:31:31.926 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:31:31 compute-0 nova_compute[192698]: 2025-10-01 14:31:31.926 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:31:32 compute-0 nova_compute[192698]: 2025-10-01 14:31:32.997 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 3665e2b0-b313-4242-af04-45597829e681 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:31:32 compute-0 nova_compute[192698]: 2025-10-01 14:31:32.998 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:31:32 compute-0 nova_compute[192698]: 2025-10-01 14:31:32.998 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:31:31 up  1:30,  0 user,  load average: 0.25, 0.20, 0.26\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_80bb651087894631addd91dd6ce2ecd0': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:31:33 compute-0 nova_compute[192698]: 2025-10-01 14:31:33.077 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:31:33 compute-0 nova_compute[192698]: 2025-10-01 14:31:33.586 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:31:34 compute-0 sshd-session[227058]: Failed password for root from 101.47.181.100 port 40544 ssh2
Oct 01 14:31:34 compute-0 nova_compute[192698]: 2025-10-01 14:31:34.098 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:31:34 compute-0 nova_compute[192698]: 2025-10-01 14:31:34.099 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.172s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:31:34 compute-0 sshd-session[227058]: Connection closed by authenticating user root 101.47.181.100 port 40544 [preauth]
Oct 01 14:31:34 compute-0 nova_compute[192698]: 2025-10-01 14:31:34.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:35 compute-0 nova_compute[192698]: 2025-10-01 14:31:35.263 2 DEBUG nova.virt.libvirt.driver [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Creating tmpfile /var/lib/nova/instances/tmp_s40s6_v to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 01 14:31:35 compute-0 nova_compute[192698]: 2025-10-01 14:31:35.264 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:31:35 compute-0 nova_compute[192698]: 2025-10-01 14:31:35.348 2 DEBUG nova.compute.manager [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_s40s6_v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 01 14:31:36 compute-0 nova_compute[192698]: 2025-10-01 14:31:36.098 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:31:36 compute-0 nova_compute[192698]: 2025-10-01 14:31:36.099 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:31:36 compute-0 nova_compute[192698]: 2025-10-01 14:31:36.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:37 compute-0 nova_compute[192698]: 2025-10-01 14:31:37.376 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:31:37 compute-0 nova_compute[192698]: 2025-10-01 14:31:37.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:31:38 compute-0 unix_chkpwd[227405]: password check failed for user (root)
Oct 01 14:31:38 compute-0 sshd-session[227403]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:31:39 compute-0 podman[227406]: 2025-10-01 14:31:39.159409381 +0000 UTC m=+0.074790693 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, config_id=edpm, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Oct 01 14:31:39 compute-0 nova_compute[192698]: 2025-10-01 14:31:39.423 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:31:39 compute-0 nova_compute[192698]: 2025-10-01 14:31:39.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:31:39 compute-0 nova_compute[192698]: 2025-10-01 14:31:39.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:31:39 compute-0 nova_compute[192698]: 2025-10-01 14:31:39.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:31:39 compute-0 nova_compute[192698]: 2025-10-01 14:31:39.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:41 compute-0 sshd-session[227403]: Failed password for root from 101.47.181.100 port 51066 ssh2
Oct 01 14:31:41 compute-0 nova_compute[192698]: 2025-10-01 14:31:41.371 2 DEBUG nova.compute.manager [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_s40s6_v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6a691fbf-e223-46a5-a8c8-914241ef1102',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 01 14:31:41 compute-0 sshd-session[227403]: Connection closed by authenticating user root 101.47.181.100 port 51066 [preauth]
Oct 01 14:31:41 compute-0 nova_compute[192698]: 2025-10-01 14:31:41.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:42 compute-0 nova_compute[192698]: 2025-10-01 14:31:42.385 2 DEBUG oslo_concurrency.lockutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-6a691fbf-e223-46a5-a8c8-914241ef1102" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:31:42 compute-0 nova_compute[192698]: 2025-10-01 14:31:42.386 2 DEBUG oslo_concurrency.lockutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-6a691fbf-e223-46a5-a8c8-914241ef1102" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:31:42 compute-0 nova_compute[192698]: 2025-10-01 14:31:42.386 2 DEBUG nova.network.neutron [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:31:42 compute-0 nova_compute[192698]: 2025-10-01 14:31:42.893 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:31:43 compute-0 unix_chkpwd[227430]: password check failed for user (root)
Oct 01 14:31:43 compute-0 sshd-session[227428]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:31:43 compute-0 nova_compute[192698]: 2025-10-01 14:31:43.888 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:31:44 compute-0 nova_compute[192698]: 2025-10-01 14:31:44.721 2 DEBUG nova.network.neutron [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Updating instance_info_cache with network_info: [{"id": "d5ac1caa-a00c-489e-927e-5762f26c0b4c", "address": "fa:16:3e:e0:63:05", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5ac1caa-a0", "ovs_interfaceid": "d5ac1caa-a00c-489e-927e-5762f26c0b4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:31:44 compute-0 nova_compute[192698]: 2025-10-01 14:31:44.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:45 compute-0 podman[227431]: 2025-10-01 14:31:45.179098647 +0000 UTC m=+0.082683505 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 01 14:31:45 compute-0 podman[227432]: 2025-10-01 14:31:45.201440608 +0000 UTC m=+0.098390658 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.227 2 DEBUG oslo_concurrency.lockutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-6a691fbf-e223-46a5-a8c8-914241ef1102" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.242 2 DEBUG nova.virt.libvirt.driver [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_s40s6_v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6a691fbf-e223-46a5-a8c8-914241ef1102',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.242 2 DEBUG nova.virt.libvirt.driver [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Creating instance directory: /var/lib/nova/instances/6a691fbf-e223-46a5-a8c8-914241ef1102 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.243 2 DEBUG nova.virt.libvirt.driver [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Creating disk.info with the contents: {'/var/lib/nova/instances/6a691fbf-e223-46a5-a8c8-914241ef1102/disk': 'qcow2', '/var/lib/nova/instances/6a691fbf-e223-46a5-a8c8-914241ef1102/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.243 2 DEBUG nova.virt.libvirt.driver [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.243 2 DEBUG nova.objects.instance [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 6a691fbf-e223-46a5-a8c8-914241ef1102 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.765 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.769 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.771 2 DEBUG oslo_concurrency.processutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.859 2 DEBUG oslo_concurrency.processutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.860 2 DEBUG oslo_concurrency.lockutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.861 2 DEBUG oslo_concurrency.lockutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.862 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.866 2 DEBUG oslo_utils.imageutils.format_inspector [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.866 2 DEBUG oslo_concurrency.processutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.915 2 DEBUG oslo_concurrency.processutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.917 2 DEBUG oslo_concurrency.processutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/6a691fbf-e223-46a5-a8c8-914241ef1102/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:31:45 compute-0 sshd-session[227428]: Failed password for root from 101.47.181.100 port 37450 ssh2
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.963 2 DEBUG oslo_concurrency.processutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/6a691fbf-e223-46a5-a8c8-914241ef1102/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.964 2 DEBUG oslo_concurrency.lockutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:31:45 compute-0 nova_compute[192698]: 2025-10-01 14:31:45.964 2 DEBUG oslo_concurrency.processutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.057 2 DEBUG oslo_concurrency.processutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.058 2 DEBUG nova.virt.disk.api [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/6a691fbf-e223-46a5-a8c8-914241ef1102/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.059 2 DEBUG oslo_concurrency.processutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a691fbf-e223-46a5-a8c8-914241ef1102/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.128 2 DEBUG oslo_concurrency.processutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6a691fbf-e223-46a5-a8c8-914241ef1102/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.130 2 DEBUG nova.virt.disk.api [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/6a691fbf-e223-46a5-a8c8-914241ef1102/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.130 2 DEBUG nova.objects.instance [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 6a691fbf-e223-46a5-a8c8-914241ef1102 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.639 2 DEBUG nova.objects.base [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<6a691fbf-e223-46a5-a8c8-914241ef1102> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.640 2 DEBUG oslo_concurrency.processutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6a691fbf-e223-46a5-a8c8-914241ef1102/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.679 2 DEBUG oslo_concurrency.processutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6a691fbf-e223-46a5-a8c8-914241ef1102/disk.config 497664" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.680 2 DEBUG nova.virt.libvirt.driver [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.682 2 DEBUG nova.virt.libvirt.vif [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-01T14:30:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-548156839',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-548156839',id=28,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:30:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='80bb651087894631addd91dd6ce2ecd0',ramdisk_id='',reservation_id='r-fj040vhh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1927341926',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1927341926-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:30:42Z,user_data=None,user_id='b71a58b28129460f94de238eedc8965c',uuid=6a691fbf-e223-46a5-a8c8-914241ef1102,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d5ac1caa-a00c-489e-927e-5762f26c0b4c", "address": "fa:16:3e:e0:63:05", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd5ac1caa-a0", "ovs_interfaceid": "d5ac1caa-a00c-489e-927e-5762f26c0b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.682 2 DEBUG nova.network.os_vif_util [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "d5ac1caa-a00c-489e-927e-5762f26c0b4c", "address": "fa:16:3e:e0:63:05", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd5ac1caa-a0", "ovs_interfaceid": "d5ac1caa-a00c-489e-927e-5762f26c0b4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.683 2 DEBUG nova.network.os_vif_util [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e0:63:05,bridge_name='br-int',has_traffic_filtering=True,id=d5ac1caa-a00c-489e-927e-5762f26c0b4c,network=Network(ac2316c2-cb81-4558-9b5e-4a4794313854),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5ac1caa-a0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.683 2 DEBUG os_vif [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:63:05,bridge_name='br-int',has_traffic_filtering=True,id=d5ac1caa-a00c-489e-927e-5762f26c0b4c,network=Network(ac2316c2-cb81-4558-9b5e-4a4794313854),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5ac1caa-a0') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.685 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.685 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.686 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '509156d7-38bb-5e3b-980b-e2c5cf1f5707', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.695 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd5ac1caa-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.695 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd5ac1caa-a0, col_values=(('qos', UUID('4e7266fd-c424-47e1-9bfc-989c7f280740')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.696 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd5ac1caa-a0, col_values=(('external_ids', {'iface-id': 'd5ac1caa-a00c-489e-927e-5762f26c0b4c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e0:63:05', 'vm-uuid': '6a691fbf-e223-46a5-a8c8-914241ef1102'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:46 compute-0 NetworkManager[51741]: <info>  [1759329106.6983] manager: (tapd5ac1caa-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:31:46 compute-0 sshd-session[227428]: Connection closed by authenticating user root 101.47.181.100 port 37450 [preauth]
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.709 2 INFO os_vif [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e0:63:05,bridge_name='br-int',has_traffic_filtering=True,id=d5ac1caa-a00c-489e-927e-5762f26c0b4c,network=Network(ac2316c2-cb81-4558-9b5e-4a4794313854),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5ac1caa-a0')
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.709 2 DEBUG nova.virt.libvirt.driver [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.710 2 DEBUG nova.compute.manager [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_s40s6_v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6a691fbf-e223-46a5-a8c8-914241ef1102',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.711 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:31:46 compute-0 nova_compute[192698]: 2025-10-01 14:31:46.807 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:31:47 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:47.081 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:31:47 compute-0 nova_compute[192698]: 2025-10-01 14:31:47.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:47 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:47.082 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:31:47 compute-0 nova_compute[192698]: 2025-10-01 14:31:47.356 2 DEBUG nova.network.neutron [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Port d5ac1caa-a00c-489e-927e-5762f26c0b4c updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 01 14:31:47 compute-0 nova_compute[192698]: 2025-10-01 14:31:47.376 2 DEBUG nova.compute.manager [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp_s40s6_v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6a691fbf-e223-46a5-a8c8-914241ef1102',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 01 14:31:49 compute-0 unix_chkpwd[227494]: password check failed for user (root)
Oct 01 14:31:49 compute-0 sshd-session[227491]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:31:50 compute-0 nova_compute[192698]: 2025-10-01 14:31:50.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:50 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:50.084 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:31:50 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 01 14:31:50 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 01 14:31:50 compute-0 podman[227495]: 2025-10-01 14:31:50.864525641 +0000 UTC m=+0.081025381 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:31:50 compute-0 kernel: tapd5ac1caa-a0: entered promiscuous mode
Oct 01 14:31:50 compute-0 NetworkManager[51741]: <info>  [1759329110.9624] manager: (tapd5ac1caa-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Oct 01 14:31:50 compute-0 ovn_controller[94909]: 2025-10-01T14:31:50Z|00236|binding|INFO|Claiming lport d5ac1caa-a00c-489e-927e-5762f26c0b4c for this additional chassis.
Oct 01 14:31:50 compute-0 ovn_controller[94909]: 2025-10-01T14:31:50Z|00237|binding|INFO|d5ac1caa-a00c-489e-927e-5762f26c0b4c: Claiming fa:16:3e:e0:63:05 10.100.0.3
Oct 01 14:31:50 compute-0 nova_compute[192698]: 2025-10-01 14:31:50.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:50 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:50.971 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:63:05 10.100.0.3'], port_security=['fa:16:3e:e0:63:05 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6a691fbf-e223-46a5-a8c8-914241ef1102', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac2316c2-cb81-4558-9b5e-4a4794313854', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80bb651087894631addd91dd6ce2ecd0', 'neutron:revision_number': '10', 'neutron:security_group_ids': '89f6228c-210a-405a-bc96-86ab494387a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9cd0b76-6dc9-458b-82d5-27e9ccc0503c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=d5ac1caa-a00c-489e-927e-5762f26c0b4c) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:31:50 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:50.973 103791 INFO neutron.agent.ovn.metadata.agent [-] Port d5ac1caa-a00c-489e-927e-5762f26c0b4c in datapath ac2316c2-cb81-4558-9b5e-4a4794313854 unbound from our chassis
Oct 01 14:31:50 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:50.974 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac2316c2-cb81-4558-9b5e-4a4794313854
Oct 01 14:31:50 compute-0 ovn_controller[94909]: 2025-10-01T14:31:50Z|00238|binding|INFO|Setting lport d5ac1caa-a00c-489e-927e-5762f26c0b4c ovn-installed in OVS
Oct 01 14:31:50 compute-0 nova_compute[192698]: 2025-10-01 14:31:50.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:50 compute-0 nova_compute[192698]: 2025-10-01 14:31:50.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:51.000 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ba006f-8996-4ecc-b30e-52deae07d1d4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:51 compute-0 systemd-udevd[227551]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:31:51 compute-0 systemd-machined[152704]: New machine qemu-23-instance-0000001c.
Oct 01 14:31:51 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-0000001c.
Oct 01 14:31:51 compute-0 NetworkManager[51741]: <info>  [1759329111.0307] device (tapd5ac1caa-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:31:51 compute-0 NetworkManager[51741]: <info>  [1759329111.0315] device (tapd5ac1caa-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:31:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:51.043 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[d2864f34-aaf8-4a9c-be9f-59ed3b050e5b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:51.047 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[0263e124-5a9b-44c2-9dfb-5f06240274ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:51.086 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[6709beab-5bd2-4b80-8b32-679cd04ab76f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:51.107 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4336d9b7-97fe-47a1-99b2-1a5757826f4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac2316c2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:b7:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542046, 'reachable_time': 18268, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227564, 'error': None, 'target': 'ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:51.123 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1df24bce-bc67-487e-bfbd-ed60bc582720]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapac2316c2-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542063, 'tstamp': 542063}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227565, 'error': None, 'target': 'ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapac2316c2-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542068, 'tstamp': 542068}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227565, 'error': None, 'target': 'ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:51.125 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac2316c2-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:31:51 compute-0 nova_compute[192698]: 2025-10-01 14:31:51.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:51 compute-0 nova_compute[192698]: 2025-10-01 14:31:51.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:51.128 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac2316c2-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:31:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:51.128 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:31:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:51.129 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac2316c2-c0, col_values=(('external_ids', {'iface-id': '3deccf94-530c-46ad-826f-fffb32b268e2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:31:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:51.129 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:31:51 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:31:51.131 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[3f1d81e0-3b67-405b-8ea5-0dba6931d8c7]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ac2316c2-cb81-4558-9b5e-4a4794313854\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ac2316c2-cb81-4558-9b5e-4a4794313854.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ac2316c2-cb81-4558-9b5e-4a4794313854\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:31:51 compute-0 nova_compute[192698]: 2025-10-01 14:31:51.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:51 compute-0 sshd-session[227491]: Failed password for root from 101.47.181.100 port 58786 ssh2
Oct 01 14:31:52 compute-0 sshd-session[227491]: Connection closed by authenticating user root 101.47.181.100 port 58786 [preauth]
Oct 01 14:31:54 compute-0 ovn_controller[94909]: 2025-10-01T14:31:54Z|00239|binding|INFO|Claiming lport d5ac1caa-a00c-489e-927e-5762f26c0b4c for this chassis.
Oct 01 14:31:54 compute-0 ovn_controller[94909]: 2025-10-01T14:31:54Z|00240|binding|INFO|d5ac1caa-a00c-489e-927e-5762f26c0b4c: Claiming fa:16:3e:e0:63:05 10.100.0.3
Oct 01 14:31:54 compute-0 ovn_controller[94909]: 2025-10-01T14:31:54Z|00241|binding|INFO|Setting lport d5ac1caa-a00c-489e-927e-5762f26c0b4c up in Southbound
Oct 01 14:31:55 compute-0 nova_compute[192698]: 2025-10-01 14:31:55.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:56 compute-0 nova_compute[192698]: 2025-10-01 14:31:56.167 2 INFO nova.compute.manager [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Post operation of migration started
Oct 01 14:31:56 compute-0 nova_compute[192698]: 2025-10-01 14:31:56.168 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:31:56 compute-0 unix_chkpwd[227589]: password check failed for user (root)
Oct 01 14:31:56 compute-0 sshd-session[227574]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:31:56 compute-0 nova_compute[192698]: 2025-10-01 14:31:56.575 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:31:56 compute-0 nova_compute[192698]: 2025-10-01 14:31:56.575 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:31:56 compute-0 nova_compute[192698]: 2025-10-01 14:31:56.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:31:56 compute-0 nova_compute[192698]: 2025-10-01 14:31:56.704 2 DEBUG oslo_concurrency.lockutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-6a691fbf-e223-46a5-a8c8-914241ef1102" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:31:56 compute-0 nova_compute[192698]: 2025-10-01 14:31:56.705 2 DEBUG oslo_concurrency.lockutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-6a691fbf-e223-46a5-a8c8-914241ef1102" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:31:56 compute-0 nova_compute[192698]: 2025-10-01 14:31:56.705 2 DEBUG nova.network.neutron [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:31:57 compute-0 nova_compute[192698]: 2025-10-01 14:31:57.217 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:31:57 compute-0 nova_compute[192698]: 2025-10-01 14:31:57.937 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:31:58 compute-0 nova_compute[192698]: 2025-10-01 14:31:58.169 2 DEBUG nova.network.neutron [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Updating instance_info_cache with network_info: [{"id": "d5ac1caa-a00c-489e-927e-5762f26c0b4c", "address": "fa:16:3e:e0:63:05", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5ac1caa-a0", "ovs_interfaceid": "d5ac1caa-a00c-489e-927e-5762f26c0b4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:31:58 compute-0 sshd-session[227574]: Failed password for root from 101.47.181.100 port 58790 ssh2
Oct 01 14:31:58 compute-0 nova_compute[192698]: 2025-10-01 14:31:58.677 2 DEBUG oslo_concurrency.lockutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-6a691fbf-e223-46a5-a8c8-914241ef1102" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:31:59 compute-0 nova_compute[192698]: 2025-10-01 14:31:59.202 2 DEBUG oslo_concurrency.lockutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:31:59 compute-0 nova_compute[192698]: 2025-10-01 14:31:59.203 2 DEBUG oslo_concurrency.lockutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:31:59 compute-0 nova_compute[192698]: 2025-10-01 14:31:59.203 2 DEBUG oslo_concurrency.lockutils [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:31:59 compute-0 nova_compute[192698]: 2025-10-01 14:31:59.208 2 INFO nova.virt.libvirt.driver [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 01 14:31:59 compute-0 virtqemud[192597]: Domain id=23 name='instance-0000001c' uuid=6a691fbf-e223-46a5-a8c8-914241ef1102 is tainted: custom-monitor
Oct 01 14:31:59 compute-0 podman[203144]: time="2025-10-01T14:31:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:31:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:31:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:31:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:31:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3494 "" "Go-http-client/1.1"
Oct 01 14:32:00 compute-0 nova_compute[192698]: 2025-10-01 14:32:00.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:00 compute-0 nova_compute[192698]: 2025-10-01 14:32:00.216 2 INFO nova.virt.libvirt.driver [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 01 14:32:01 compute-0 nova_compute[192698]: 2025-10-01 14:32:01.225 2 INFO nova.virt.libvirt.driver [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 01 14:32:01 compute-0 nova_compute[192698]: 2025-10-01 14:32:01.229 2 DEBUG nova.compute.manager [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:32:01 compute-0 openstack_network_exporter[205307]: ERROR   14:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:32:01 compute-0 openstack_network_exporter[205307]: ERROR   14:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:32:01 compute-0 openstack_network_exporter[205307]: ERROR   14:32:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:32:01 compute-0 openstack_network_exporter[205307]: ERROR   14:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:32:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:32:01 compute-0 openstack_network_exporter[205307]: ERROR   14:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:32:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:32:01 compute-0 sshd-session[227574]: Connection closed by authenticating user root 101.47.181.100 port 58790 [preauth]
Oct 01 14:32:01 compute-0 nova_compute[192698]: 2025-10-01 14:32:01.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:01 compute-0 nova_compute[192698]: 2025-10-01 14:32:01.738 2 DEBUG nova.objects.instance [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:32:02 compute-0 unix_chkpwd[227606]: password check failed for user (root)
Oct 01 14:32:02 compute-0 sshd-session[227590]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:32:02 compute-0 podman[227592]: 2025-10-01 14:32:02.18722173 +0000 UTC m=+0.082459409 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 01 14:32:02 compute-0 podman[227593]: 2025-10-01 14:32:02.21882042 +0000 UTC m=+0.118663893 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 01 14:32:02 compute-0 nova_compute[192698]: 2025-10-01 14:32:02.759 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:32:03 compute-0 nova_compute[192698]: 2025-10-01 14:32:03.578 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:32:03 compute-0 nova_compute[192698]: 2025-10-01 14:32:03.579 2 WARNING neutronclient.v2_0.client [None req-c10834bf-2d69-4187-9d55-e4441e1a6793 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:32:04 compute-0 sshd-session[227590]: Failed password for root from 101.47.181.100 port 57350 ssh2
Oct 01 14:32:05 compute-0 nova_compute[192698]: 2025-10-01 14:32:05.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:05 compute-0 nova_compute[192698]: 2025-10-01 14:32:05.496 2 DEBUG oslo_concurrency.lockutils [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquiring lock "3665e2b0-b313-4242-af04-45597829e681" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:05 compute-0 nova_compute[192698]: 2025-10-01 14:32:05.497 2 DEBUG oslo_concurrency.lockutils [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:05 compute-0 nova_compute[192698]: 2025-10-01 14:32:05.498 2 DEBUG oslo_concurrency.lockutils [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquiring lock "3665e2b0-b313-4242-af04-45597829e681-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:05 compute-0 nova_compute[192698]: 2025-10-01 14:32:05.499 2 DEBUG oslo_concurrency.lockutils [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:05 compute-0 nova_compute[192698]: 2025-10-01 14:32:05.499 2 DEBUG oslo_concurrency.lockutils [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:05 compute-0 nova_compute[192698]: 2025-10-01 14:32:05.519 2 INFO nova.compute.manager [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Terminating instance
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.040 2 DEBUG nova.compute.manager [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:32:06 compute-0 kernel: tapf853ffac-a8 (unregistering): left promiscuous mode
Oct 01 14:32:06 compute-0 NetworkManager[51741]: <info>  [1759329126.0753] device (tapf853ffac-a8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:32:06 compute-0 ovn_controller[94909]: 2025-10-01T14:32:06Z|00242|binding|INFO|Releasing lport f853ffac-a897-4f2b-9131-4b4cc7ffdb18 from this chassis (sb_readonly=0)
Oct 01 14:32:06 compute-0 ovn_controller[94909]: 2025-10-01T14:32:06Z|00243|binding|INFO|Setting lport f853ffac-a897-4f2b-9131-4b4cc7ffdb18 down in Southbound
Oct 01 14:32:06 compute-0 ovn_controller[94909]: 2025-10-01T14:32:06Z|00244|binding|INFO|Removing iface tapf853ffac-a8 ovn-installed in OVS
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.106 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:1c:76 10.100.0.12'], port_security=['fa:16:3e:8a:1c:76 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3665e2b0-b313-4242-af04-45597829e681', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac2316c2-cb81-4558-9b5e-4a4794313854', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80bb651087894631addd91dd6ce2ecd0', 'neutron:revision_number': '5', 'neutron:security_group_ids': '89f6228c-210a-405a-bc96-86ab494387a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9cd0b76-6dc9-458b-82d5-27e9ccc0503c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=f853ffac-a897-4f2b-9131-4b4cc7ffdb18) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.107 103791 INFO neutron.agent.ovn.metadata.agent [-] Port f853ffac-a897-4f2b-9131-4b4cc7ffdb18 in datapath ac2316c2-cb81-4558-9b5e-4a4794313854 unbound from our chassis
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.109 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ac2316c2-cb81-4558-9b5e-4a4794313854
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:06 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Oct 01 14:32:06 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Consumed 15.773s CPU time.
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.142 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[b53b0776-1b1f-4a7f-a214-fdc84db60854]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:06 compute-0 systemd-machined[152704]: Machine qemu-22-instance-0000001d terminated.
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.187 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[47b69c8a-55c7-48d2-b466-1ebe1250e9cf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.190 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a3d7c994-202c-4c5c-82b9-f05594b8051a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.228 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[3e43fe5f-c68b-4f0a-a631-b9c586d49487]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.258 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[8367cc4e-60a4-4449-9467-40d2a0d21e0f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapac2316c2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:b7:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 8, 'rx_bytes': 1756, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542046, 'reachable_time': 18268, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227647, 'error': None, 'target': 'ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.290 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[77d2b416-7fc5-49c4-9af6-d994b9976cb9]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapac2316c2-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542063, 'tstamp': 542063}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227650, 'error': None, 'target': 'ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapac2316c2-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542068, 'tstamp': 542068}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227650, 'error': None, 'target': 'ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.292 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac2316c2-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.304 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac2316c2-c0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.305 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.305 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapac2316c2-c0, col_values=(('external_ids', {'iface-id': '3deccf94-530c-46ad-826f-fffb32b268e2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.306 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:32:06 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:06.308 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e241f0bc-5141-4e99-80c6-31f082eafc63]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-ac2316c2-cb81-4558-9b5e-4a4794313854\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/ac2316c2-cb81-4558-9b5e-4a4794313854.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID ac2316c2-cb81-4558-9b5e-4a4794313854\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.347 2 INFO nova.virt.libvirt.driver [-] [instance: 3665e2b0-b313-4242-af04-45597829e681] Instance destroyed successfully.
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.348 2 DEBUG nova.objects.instance [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lazy-loading 'resources' on Instance uuid 3665e2b0-b313-4242-af04-45597829e681 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.691 2 DEBUG nova.compute.manager [req-c958698b-dc58-4f6a-afe6-5f31d13e92bc req-40112f23-8186-4bd8-88e2-69a5f7eb9356 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Received event network-vif-unplugged-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.691 2 DEBUG oslo_concurrency.lockutils [req-c958698b-dc58-4f6a-afe6-5f31d13e92bc req-40112f23-8186-4bd8-88e2-69a5f7eb9356 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "3665e2b0-b313-4242-af04-45597829e681-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.691 2 DEBUG oslo_concurrency.lockutils [req-c958698b-dc58-4f6a-afe6-5f31d13e92bc req-40112f23-8186-4bd8-88e2-69a5f7eb9356 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.692 2 DEBUG oslo_concurrency.lockutils [req-c958698b-dc58-4f6a-afe6-5f31d13e92bc req-40112f23-8186-4bd8-88e2-69a5f7eb9356 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.692 2 DEBUG nova.compute.manager [req-c958698b-dc58-4f6a-afe6-5f31d13e92bc req-40112f23-8186-4bd8-88e2-69a5f7eb9356 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] No waiting events found dispatching network-vif-unplugged-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.692 2 DEBUG nova.compute.manager [req-c958698b-dc58-4f6a-afe6-5f31d13e92bc req-40112f23-8186-4bd8-88e2-69a5f7eb9356 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Received event network-vif-unplugged-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.857 2 DEBUG nova.virt.libvirt.vif [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:30:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-344748452',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-344748452',id=29,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:31:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='80bb651087894631addd91dd6ce2ecd0',ramdisk_id='',reservation_id='r-ep8em9s8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',imag
e_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1927341926',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1927341926-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:31:01Z,user_data=None,user_id='b71a58b28129460f94de238eedc8965c',uuid=3665e2b0-b313-4242-af04-45597829e681,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "address": "fa:16:3e:8a:1c:76", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf853ffac-a8", "ovs_interfaceid": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.858 2 DEBUG nova.network.os_vif_util [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Converting VIF {"id": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "address": "fa:16:3e:8a:1c:76", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf853ffac-a8", "ovs_interfaceid": "f853ffac-a897-4f2b-9131-4b4cc7ffdb18", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.859 2 DEBUG nova.network.os_vif_util [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:1c:76,bridge_name='br-int',has_traffic_filtering=True,id=f853ffac-a897-4f2b-9131-4b4cc7ffdb18,network=Network(ac2316c2-cb81-4558-9b5e-4a4794313854),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf853ffac-a8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.860 2 DEBUG os_vif [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:1c:76,bridge_name='br-int',has_traffic_filtering=True,id=f853ffac-a897-4f2b-9131-4b4cc7ffdb18,network=Network(ac2316c2-cb81-4558-9b5e-4a4794313854),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf853ffac-a8') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.863 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.863 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf853ffac-a8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.871 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=3954863e-2cb7-4f97-bf36-21ae02aa1c21) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.878 2 INFO os_vif [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:1c:76,bridge_name='br-int',has_traffic_filtering=True,id=f853ffac-a897-4f2b-9131-4b4cc7ffdb18,network=Network(ac2316c2-cb81-4558-9b5e-4a4794313854),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf853ffac-a8')
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.879 2 INFO nova.virt.libvirt.driver [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Deleting instance files /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681_del
Oct 01 14:32:06 compute-0 nova_compute[192698]: 2025-10-01 14:32:06.881 2 INFO nova.virt.libvirt.driver [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Deletion of /var/lib/nova/instances/3665e2b0-b313-4242-af04-45597829e681_del complete
Oct 01 14:32:07 compute-0 nova_compute[192698]: 2025-10-01 14:32:07.404 2 INFO nova.compute.manager [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Took 1.36 seconds to destroy the instance on the hypervisor.
Oct 01 14:32:07 compute-0 nova_compute[192698]: 2025-10-01 14:32:07.405 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:32:07 compute-0 nova_compute[192698]: 2025-10-01 14:32:07.406 2 DEBUG nova.compute.manager [-] [instance: 3665e2b0-b313-4242-af04-45597829e681] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:32:07 compute-0 nova_compute[192698]: 2025-10-01 14:32:07.406 2 DEBUG nova.network.neutron [-] [instance: 3665e2b0-b313-4242-af04-45597829e681] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:32:07 compute-0 nova_compute[192698]: 2025-10-01 14:32:07.406 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:32:07 compute-0 nova_compute[192698]: 2025-10-01 14:32:07.586 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:32:07 compute-0 sshd-session[227590]: Connection closed by authenticating user root 101.47.181.100 port 57350 [preauth]
Oct 01 14:32:08 compute-0 nova_compute[192698]: 2025-10-01 14:32:08.408 2 DEBUG nova.network.neutron [-] [instance: 3665e2b0-b313-4242-af04-45597829e681] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:32:08 compute-0 nova_compute[192698]: 2025-10-01 14:32:08.777 2 DEBUG nova.compute.manager [req-617d9405-06a7-414b-b089-4997d2e07275 req-fa5c45b6-9dde-4c74-bc8c-84d1a9e0c45d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Received event network-vif-unplugged-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:32:08 compute-0 nova_compute[192698]: 2025-10-01 14:32:08.778 2 DEBUG oslo_concurrency.lockutils [req-617d9405-06a7-414b-b089-4997d2e07275 req-fa5c45b6-9dde-4c74-bc8c-84d1a9e0c45d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "3665e2b0-b313-4242-af04-45597829e681-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:08 compute-0 nova_compute[192698]: 2025-10-01 14:32:08.778 2 DEBUG oslo_concurrency.lockutils [req-617d9405-06a7-414b-b089-4997d2e07275 req-fa5c45b6-9dde-4c74-bc8c-84d1a9e0c45d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:08 compute-0 nova_compute[192698]: 2025-10-01 14:32:08.779 2 DEBUG oslo_concurrency.lockutils [req-617d9405-06a7-414b-b089-4997d2e07275 req-fa5c45b6-9dde-4c74-bc8c-84d1a9e0c45d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:08 compute-0 nova_compute[192698]: 2025-10-01 14:32:08.779 2 DEBUG nova.compute.manager [req-617d9405-06a7-414b-b089-4997d2e07275 req-fa5c45b6-9dde-4c74-bc8c-84d1a9e0c45d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] No waiting events found dispatching network-vif-unplugged-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:32:08 compute-0 nova_compute[192698]: 2025-10-01 14:32:08.780 2 DEBUG nova.compute.manager [req-617d9405-06a7-414b-b089-4997d2e07275 req-fa5c45b6-9dde-4c74-bc8c-84d1a9e0c45d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Received event network-vif-unplugged-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:32:08 compute-0 nova_compute[192698]: 2025-10-01 14:32:08.780 2 DEBUG nova.compute.manager [req-617d9405-06a7-414b-b089-4997d2e07275 req-fa5c45b6-9dde-4c74-bc8c-84d1a9e0c45d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 3665e2b0-b313-4242-af04-45597829e681] Received event network-vif-deleted-f853ffac-a897-4f2b-9131-4b4cc7ffdb18 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:32:08 compute-0 nova_compute[192698]: 2025-10-01 14:32:08.918 2 INFO nova.compute.manager [-] [instance: 3665e2b0-b313-4242-af04-45597829e681] Took 1.51 seconds to deallocate network for instance.
Oct 01 14:32:09 compute-0 nova_compute[192698]: 2025-10-01 14:32:09.448 2 DEBUG oslo_concurrency.lockutils [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:09 compute-0 nova_compute[192698]: 2025-10-01 14:32:09.449 2 DEBUG oslo_concurrency.lockutils [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:09 compute-0 nova_compute[192698]: 2025-10-01 14:32:09.548 2 DEBUG nova.compute.provider_tree [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:32:10 compute-0 unix_chkpwd[227668]: password check failed for user (root)
Oct 01 14:32:10 compute-0 sshd-session[227666]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:32:10 compute-0 nova_compute[192698]: 2025-10-01 14:32:10.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:10 compute-0 nova_compute[192698]: 2025-10-01 14:32:10.057 2 DEBUG nova.scheduler.client.report [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:32:10 compute-0 podman[227669]: 2025-10-01 14:32:10.184831525 +0000 UTC m=+0.089016165 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, vcs-type=git, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7)
Oct 01 14:32:10 compute-0 nova_compute[192698]: 2025-10-01 14:32:10.571 2 DEBUG oslo_concurrency.lockutils [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.122s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:10 compute-0 nova_compute[192698]: 2025-10-01 14:32:10.626 2 INFO nova.scheduler.client.report [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Deleted allocations for instance 3665e2b0-b313-4242-af04-45597829e681
Oct 01 14:32:11 compute-0 nova_compute[192698]: 2025-10-01 14:32:11.662 2 DEBUG oslo_concurrency.lockutils [None req-34463f94-b108-41d6-95fc-440dd964b668 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "3665e2b0-b313-4242-af04-45597829e681" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.164s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:11 compute-0 sshd-session[227666]: Failed password for root from 101.47.181.100 port 35114 ssh2
Oct 01 14:32:11 compute-0 nova_compute[192698]: 2025-10-01 14:32:11.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:12 compute-0 nova_compute[192698]: 2025-10-01 14:32:12.366 2 DEBUG oslo_concurrency.lockutils [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquiring lock "6a691fbf-e223-46a5-a8c8-914241ef1102" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:12 compute-0 nova_compute[192698]: 2025-10-01 14:32:12.366 2 DEBUG oslo_concurrency.lockutils [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "6a691fbf-e223-46a5-a8c8-914241ef1102" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:12 compute-0 nova_compute[192698]: 2025-10-01 14:32:12.367 2 DEBUG oslo_concurrency.lockutils [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquiring lock "6a691fbf-e223-46a5-a8c8-914241ef1102-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:12 compute-0 nova_compute[192698]: 2025-10-01 14:32:12.367 2 DEBUG oslo_concurrency.lockutils [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "6a691fbf-e223-46a5-a8c8-914241ef1102-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:12 compute-0 nova_compute[192698]: 2025-10-01 14:32:12.367 2 DEBUG oslo_concurrency.lockutils [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "6a691fbf-e223-46a5-a8c8-914241ef1102-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:12 compute-0 nova_compute[192698]: 2025-10-01 14:32:12.383 2 INFO nova.compute.manager [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Terminating instance
Oct 01 14:32:12 compute-0 nova_compute[192698]: 2025-10-01 14:32:12.900 2 DEBUG nova.compute.manager [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:32:12 compute-0 kernel: tapd5ac1caa-a0 (unregistering): left promiscuous mode
Oct 01 14:32:12 compute-0 NetworkManager[51741]: <info>  [1759329132.9280] device (tapd5ac1caa-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:32:12 compute-0 ovn_controller[94909]: 2025-10-01T14:32:12Z|00245|binding|INFO|Releasing lport d5ac1caa-a00c-489e-927e-5762f26c0b4c from this chassis (sb_readonly=0)
Oct 01 14:32:12 compute-0 nova_compute[192698]: 2025-10-01 14:32:12.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:12 compute-0 ovn_controller[94909]: 2025-10-01T14:32:12Z|00246|binding|INFO|Setting lport d5ac1caa-a00c-489e-927e-5762f26c0b4c down in Southbound
Oct 01 14:32:12 compute-0 ovn_controller[94909]: 2025-10-01T14:32:12Z|00247|binding|INFO|Removing iface tapd5ac1caa-a0 ovn-installed in OVS
Oct 01 14:32:12 compute-0 nova_compute[192698]: 2025-10-01 14:32:12.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:12 compute-0 sshd-session[227666]: Connection closed by authenticating user root 101.47.181.100 port 35114 [preauth]
Oct 01 14:32:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:12.953 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e0:63:05 10.100.0.3'], port_security=['fa:16:3e:e0:63:05 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6a691fbf-e223-46a5-a8c8-914241ef1102', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ac2316c2-cb81-4558-9b5e-4a4794313854', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80bb651087894631addd91dd6ce2ecd0', 'neutron:revision_number': '15', 'neutron:security_group_ids': '89f6228c-210a-405a-bc96-86ab494387a8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b9cd0b76-6dc9-458b-82d5-27e9ccc0503c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=d5ac1caa-a00c-489e-927e-5762f26c0b4c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:32:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:12.956 103791 INFO neutron.agent.ovn.metadata.agent [-] Port d5ac1caa-a00c-489e-927e-5762f26c0b4c in datapath ac2316c2-cb81-4558-9b5e-4a4794313854 unbound from our chassis
Oct 01 14:32:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:12.958 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ac2316c2-cb81-4558-9b5e-4a4794313854, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:32:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:12.959 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[04d48137-1876-40e3-808b-22dd9ef6466c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:12.960 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854 namespace which is not needed anymore
Oct 01 14:32:12 compute-0 nova_compute[192698]: 2025-10-01 14:32:12.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:13 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Oct 01 14:32:13 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001c.scope: Consumed 3.272s CPU time.
Oct 01 14:32:13 compute-0 systemd-machined[152704]: Machine qemu-23-instance-0000001c terminated.
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.117 2 DEBUG nova.compute.manager [req-24412bb0-044b-4bec-8786-1e062180065c req-3c2fa25d-ff1a-4c9c-9d43-7e1edf9be1f0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Received event network-vif-unplugged-d5ac1caa-a00c-489e-927e-5762f26c0b4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.117 2 DEBUG oslo_concurrency.lockutils [req-24412bb0-044b-4bec-8786-1e062180065c req-3c2fa25d-ff1a-4c9c-9d43-7e1edf9be1f0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "6a691fbf-e223-46a5-a8c8-914241ef1102-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.117 2 DEBUG oslo_concurrency.lockutils [req-24412bb0-044b-4bec-8786-1e062180065c req-3c2fa25d-ff1a-4c9c-9d43-7e1edf9be1f0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "6a691fbf-e223-46a5-a8c8-914241ef1102-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.117 2 DEBUG oslo_concurrency.lockutils [req-24412bb0-044b-4bec-8786-1e062180065c req-3c2fa25d-ff1a-4c9c-9d43-7e1edf9be1f0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "6a691fbf-e223-46a5-a8c8-914241ef1102-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.118 2 DEBUG nova.compute.manager [req-24412bb0-044b-4bec-8786-1e062180065c req-3c2fa25d-ff1a-4c9c-9d43-7e1edf9be1f0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] No waiting events found dispatching network-vif-unplugged-d5ac1caa-a00c-489e-927e-5762f26c0b4c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.118 2 DEBUG nova.compute.manager [req-24412bb0-044b-4bec-8786-1e062180065c req-3c2fa25d-ff1a-4c9c-9d43-7e1edf9be1f0 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Received event network-vif-unplugged-d5ac1caa-a00c-489e-927e-5762f26c0b4c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:32:13 compute-0 neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854[227187]: [NOTICE]   (227217) : haproxy version is 3.0.5-8e879a5
Oct 01 14:32:13 compute-0 podman[227717]: 2025-10-01 14:32:13.126201207 +0000 UTC m=+0.039983437 container kill 9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 01 14:32:13 compute-0 neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854[227187]: [NOTICE]   (227217) : path to executable is /usr/sbin/haproxy
Oct 01 14:32:13 compute-0 neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854[227187]: [WARNING]  (227217) : Exiting Master process...
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:13 compute-0 neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854[227187]: [ALERT]    (227217) : Current worker (227229) exited with code 143 (Terminated)
Oct 01 14:32:13 compute-0 neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854[227187]: [WARNING]  (227217) : All workers exited. Exiting... (0)
Oct 01 14:32:13 compute-0 systemd[1]: libpod-9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3.scope: Deactivated successfully.
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:13 compute-0 podman[227737]: 2025-10-01 14:32:13.179807849 +0000 UTC m=+0.026884335 container died 9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.180 2 INFO nova.virt.libvirt.driver [-] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Instance destroyed successfully.
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.181 2 DEBUG nova.objects.instance [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lazy-loading 'resources' on Instance uuid 6a691fbf-e223-46a5-a8c8-914241ef1102 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:32:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3-userdata-shm.mount: Deactivated successfully.
Oct 01 14:32:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-2a1c0f6836b9e3c0c4f3a2f50b85afeef6ea5a990508bbe9d6549f1f75ae0914-merged.mount: Deactivated successfully.
Oct 01 14:32:13 compute-0 podman[227737]: 2025-10-01 14:32:13.219897397 +0000 UTC m=+0.066973863 container cleanup 9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 01 14:32:13 compute-0 systemd[1]: libpod-conmon-9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3.scope: Deactivated successfully.
Oct 01 14:32:13 compute-0 podman[227745]: 2025-10-01 14:32:13.242633259 +0000 UTC m=+0.081817302 container remove 9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 01 14:32:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:13.252 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[045c3027-6f68-48bf-b509-732ae5e6b759]: (4, ("Wed Oct  1 02:32:13 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854 (9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3)\n9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3\nWed Oct  1 02:32:13 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854 (9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3)\n9944d83b04be09e3d751008fcfbab5cc96a0d23cbf20af1fcae3ab1d5b93b5e3\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:13.254 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7c15c251-b701-4a38-bc4a-bcf13ac4950e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:13.254 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ac2316c2-cb81-4558-9b5e-4a4794313854.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ac2316c2-cb81-4558-9b5e-4a4794313854.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:32:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:13.255 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e825bd-ea1c-43a5-8aa8-6918753e39b9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:13.255 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac2316c2-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:13 compute-0 kernel: tapac2316c2-c0: left promiscuous mode
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:13.323 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ce9be6f5-7e42-47e8-98b0-ebc5f8550f30]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:13.366 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[eebb84f5-ce7a-429e-b737-96bc1fd3d231]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:13.368 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4c54e433-3cd5-470e-b246-57008bea7a27]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:13.385 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e65ebe8e-900b-4e04-87bc-cc0543926935]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542037, 'reachable_time': 36903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227784, 'error': None, 'target': 'ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:13.390 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ac2316c2-cb81-4558-9b5e-4a4794313854 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:32:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:13.390 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[15509bd1-5328-4f36-9ad7-5b71326651e9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:13 compute-0 systemd[1]: run-netns-ovnmeta\x2dac2316c2\x2dcb81\x2d4558\x2d9b5e\x2d4a4794313854.mount: Deactivated successfully.
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.689 2 DEBUG nova.virt.libvirt.vif [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-01T14:30:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-548156839',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-548156839',id=28,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:30:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='80bb651087894631addd91dd6ce2ecd0',ramdisk_id='',reservation_id='r-fj040vhh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_m
odel='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1927341926',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1927341926-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:32:02Z,user_data=None,user_id='b71a58b28129460f94de238eedc8965c',uuid=6a691fbf-e223-46a5-a8c8-914241ef1102,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d5ac1caa-a00c-489e-927e-5762f26c0b4c", "address": "fa:16:3e:e0:63:05", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5ac1caa-a0", "ovs_interfaceid": "d5ac1caa-a00c-489e-927e-5762f26c0b4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.690 2 DEBUG nova.network.os_vif_util [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Converting VIF {"id": "d5ac1caa-a00c-489e-927e-5762f26c0b4c", "address": "fa:16:3e:e0:63:05", "network": {"id": "ac2316c2-cb81-4558-9b5e-4a4794313854", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1793282763-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e30e83299c1e445dbba9473590367e5b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd5ac1caa-a0", "ovs_interfaceid": "d5ac1caa-a00c-489e-927e-5762f26c0b4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.690 2 DEBUG nova.network.os_vif_util [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e0:63:05,bridge_name='br-int',has_traffic_filtering=True,id=d5ac1caa-a00c-489e-927e-5762f26c0b4c,network=Network(ac2316c2-cb81-4558-9b5e-4a4794313854),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5ac1caa-a0') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.691 2 DEBUG os_vif [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:63:05,bridge_name='br-int',has_traffic_filtering=True,id=d5ac1caa-a00c-489e-927e-5762f26c0b4c,network=Network(ac2316c2-cb81-4558-9b5e-4a4794313854),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5ac1caa-a0') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.693 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd5ac1caa-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.698 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=4e7266fd-c424-47e1-9bfc-989c7f280740) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.702 2 INFO os_vif [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e0:63:05,bridge_name='br-int',has_traffic_filtering=True,id=d5ac1caa-a00c-489e-927e-5762f26c0b4c,network=Network(ac2316c2-cb81-4558-9b5e-4a4794313854),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd5ac1caa-a0')
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.703 2 INFO nova.virt.libvirt.driver [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Deleting instance files /var/lib/nova/instances/6a691fbf-e223-46a5-a8c8-914241ef1102_del
Oct 01 14:32:13 compute-0 nova_compute[192698]: 2025-10-01 14:32:13.704 2 INFO nova.virt.libvirt.driver [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Deletion of /var/lib/nova/instances/6a691fbf-e223-46a5-a8c8-914241ef1102_del complete
Oct 01 14:32:14 compute-0 nova_compute[192698]: 2025-10-01 14:32:14.217 2 INFO nova.compute.manager [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 01 14:32:14 compute-0 nova_compute[192698]: 2025-10-01 14:32:14.218 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:32:14 compute-0 nova_compute[192698]: 2025-10-01 14:32:14.218 2 DEBUG nova.compute.manager [-] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:32:14 compute-0 nova_compute[192698]: 2025-10-01 14:32:14.219 2 DEBUG nova.network.neutron [-] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:32:14 compute-0 nova_compute[192698]: 2025-10-01 14:32:14.219 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:32:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:14.298 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:14.298 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:14.299 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:14 compute-0 nova_compute[192698]: 2025-10-01 14:32:14.577 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:32:14 compute-0 unix_chkpwd[227786]: password check failed for user (root)
Oct 01 14:32:14 compute-0 sshd-session[227775]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:32:15 compute-0 nova_compute[192698]: 2025-10-01 14:32:15.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:15 compute-0 nova_compute[192698]: 2025-10-01 14:32:15.216 2 DEBUG nova.compute.manager [req-48f16f00-544e-49c7-869f-bc7a8c645a57 req-a0517dc2-a902-492e-b196-eb87d41cf833 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Received event network-vif-unplugged-d5ac1caa-a00c-489e-927e-5762f26c0b4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:32:15 compute-0 nova_compute[192698]: 2025-10-01 14:32:15.217 2 DEBUG oslo_concurrency.lockutils [req-48f16f00-544e-49c7-869f-bc7a8c645a57 req-a0517dc2-a902-492e-b196-eb87d41cf833 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "6a691fbf-e223-46a5-a8c8-914241ef1102-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:15 compute-0 nova_compute[192698]: 2025-10-01 14:32:15.217 2 DEBUG oslo_concurrency.lockutils [req-48f16f00-544e-49c7-869f-bc7a8c645a57 req-a0517dc2-a902-492e-b196-eb87d41cf833 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "6a691fbf-e223-46a5-a8c8-914241ef1102-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:15 compute-0 nova_compute[192698]: 2025-10-01 14:32:15.218 2 DEBUG oslo_concurrency.lockutils [req-48f16f00-544e-49c7-869f-bc7a8c645a57 req-a0517dc2-a902-492e-b196-eb87d41cf833 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "6a691fbf-e223-46a5-a8c8-914241ef1102-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:15 compute-0 nova_compute[192698]: 2025-10-01 14:32:15.218 2 DEBUG nova.compute.manager [req-48f16f00-544e-49c7-869f-bc7a8c645a57 req-a0517dc2-a902-492e-b196-eb87d41cf833 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] No waiting events found dispatching network-vif-unplugged-d5ac1caa-a00c-489e-927e-5762f26c0b4c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:32:15 compute-0 nova_compute[192698]: 2025-10-01 14:32:15.218 2 DEBUG nova.compute.manager [req-48f16f00-544e-49c7-869f-bc7a8c645a57 req-a0517dc2-a902-492e-b196-eb87d41cf833 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Received event network-vif-unplugged-d5ac1caa-a00c-489e-927e-5762f26c0b4c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:32:15 compute-0 nova_compute[192698]: 2025-10-01 14:32:15.219 2 DEBUG nova.compute.manager [req-48f16f00-544e-49c7-869f-bc7a8c645a57 req-a0517dc2-a902-492e-b196-eb87d41cf833 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Received event network-vif-deleted-d5ac1caa-a00c-489e-927e-5762f26c0b4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:32:15 compute-0 nova_compute[192698]: 2025-10-01 14:32:15.219 2 INFO nova.compute.manager [req-48f16f00-544e-49c7-869f-bc7a8c645a57 req-a0517dc2-a902-492e-b196-eb87d41cf833 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Neutron deleted interface d5ac1caa-a00c-489e-927e-5762f26c0b4c; detaching it from the instance and deleting it from the info cache
Oct 01 14:32:15 compute-0 nova_compute[192698]: 2025-10-01 14:32:15.220 2 DEBUG nova.network.neutron [req-48f16f00-544e-49c7-869f-bc7a8c645a57 req-a0517dc2-a902-492e-b196-eb87d41cf833 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:32:15 compute-0 nova_compute[192698]: 2025-10-01 14:32:15.322 2 DEBUG nova.network.neutron [-] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:32:15 compute-0 podman[227787]: 2025-10-01 14:32:15.381318775 +0000 UTC m=+0.106243659 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 01 14:32:15 compute-0 podman[227788]: 2025-10-01 14:32:15.395068715 +0000 UTC m=+0.122890367 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 01 14:32:15 compute-0 nova_compute[192698]: 2025-10-01 14:32:15.729 2 DEBUG nova.compute.manager [req-48f16f00-544e-49c7-869f-bc7a8c645a57 req-a0517dc2-a902-492e-b196-eb87d41cf833 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Detach interface failed, port_id=d5ac1caa-a00c-489e-927e-5762f26c0b4c, reason: Instance 6a691fbf-e223-46a5-a8c8-914241ef1102 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:32:15 compute-0 nova_compute[192698]: 2025-10-01 14:32:15.831 2 INFO nova.compute.manager [-] [instance: 6a691fbf-e223-46a5-a8c8-914241ef1102] Took 1.61 seconds to deallocate network for instance.
Oct 01 14:32:16 compute-0 nova_compute[192698]: 2025-10-01 14:32:16.355 2 DEBUG oslo_concurrency.lockutils [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:16 compute-0 nova_compute[192698]: 2025-10-01 14:32:16.356 2 DEBUG oslo_concurrency.lockutils [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:16 compute-0 nova_compute[192698]: 2025-10-01 14:32:16.364 2 DEBUG oslo_concurrency.lockutils [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.008s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:16 compute-0 nova_compute[192698]: 2025-10-01 14:32:16.391 2 INFO nova.scheduler.client.report [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Deleted allocations for instance 6a691fbf-e223-46a5-a8c8-914241ef1102
Oct 01 14:32:17 compute-0 nova_compute[192698]: 2025-10-01 14:32:17.423 2 DEBUG oslo_concurrency.lockutils [None req-19cb0a2d-9ad7-4dde-84d0-af7dbaf74a41 b71a58b28129460f94de238eedc8965c 80bb651087894631addd91dd6ce2ecd0 - - default default] Lock "6a691fbf-e223-46a5-a8c8-914241ef1102" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.056s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:17 compute-0 sshd-session[227775]: Failed password for root from 101.47.181.100 port 35130 ssh2
Oct 01 14:32:18 compute-0 nova_compute[192698]: 2025-10-01 14:32:18.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:20 compute-0 nova_compute[192698]: 2025-10-01 14:32:20.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:21 compute-0 podman[227827]: 2025-10-01 14:32:21.144861399 +0000 UTC m=+0.055901535 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:32:21 compute-0 sshd-session[227775]: Connection closed by authenticating user root 101.47.181.100 port 35130 [preauth]
Oct 01 14:32:23 compute-0 nova_compute[192698]: 2025-10-01 14:32:23.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:24 compute-0 nova_compute[192698]: 2025-10-01 14:32:24.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:25 compute-0 nova_compute[192698]: 2025-10-01 14:32:25.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:28 compute-0 unix_chkpwd[227854]: password check failed for user (root)
Oct 01 14:32:28 compute-0 sshd-session[227852]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:32:28 compute-0 nova_compute[192698]: 2025-10-01 14:32:28.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:29 compute-0 sshd-session[227852]: Failed password for root from 101.47.181.100 port 51578 ssh2
Oct 01 14:32:29 compute-0 podman[203144]: time="2025-10-01T14:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:32:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:32:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3023 "" "Go-http-client/1.1"
Oct 01 14:32:30 compute-0 nova_compute[192698]: 2025-10-01 14:32:30.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:30 compute-0 nova_compute[192698]: 2025-10-01 14:32:30.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:32:30 compute-0 nova_compute[192698]: 2025-10-01 14:32:30.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:32:31 compute-0 sshd-session[227852]: Connection closed by authenticating user root 101.47.181.100 port 51578 [preauth]
Oct 01 14:32:31 compute-0 openstack_network_exporter[205307]: ERROR   14:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:32:31 compute-0 openstack_network_exporter[205307]: ERROR   14:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:32:31 compute-0 openstack_network_exporter[205307]: ERROR   14:32:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:32:31 compute-0 openstack_network_exporter[205307]: ERROR   14:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:32:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:32:31 compute-0 openstack_network_exporter[205307]: ERROR   14:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:32:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:32:31 compute-0 nova_compute[192698]: 2025-10-01 14:32:31.488 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:31 compute-0 nova_compute[192698]: 2025-10-01 14:32:31.489 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:31 compute-0 nova_compute[192698]: 2025-10-01 14:32:31.490 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:31 compute-0 nova_compute[192698]: 2025-10-01 14:32:31.490 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:32:31 compute-0 nova_compute[192698]: 2025-10-01 14:32:31.698 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:32:31 compute-0 nova_compute[192698]: 2025-10-01 14:32:31.699 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:32:31 compute-0 nova_compute[192698]: 2025-10-01 14:32:31.722 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:32:31 compute-0 nova_compute[192698]: 2025-10-01 14:32:31.723 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5840MB free_disk=73.30199813842773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:32:31 compute-0 nova_compute[192698]: 2025-10-01 14:32:31.723 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:31 compute-0 nova_compute[192698]: 2025-10-01 14:32:31.723 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:32 compute-0 nova_compute[192698]: 2025-10-01 14:32:32.781 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:32:32 compute-0 nova_compute[192698]: 2025-10-01 14:32:32.781 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:32:31 up  1:31,  0 user,  load average: 0.13, 0.18, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:32:32 compute-0 nova_compute[192698]: 2025-10-01 14:32:32.830 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:32:33 compute-0 podman[227859]: 2025-10-01 14:32:33.159285779 +0000 UTC m=+0.067687792 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Oct 01 14:32:33 compute-0 podman[227860]: 2025-10-01 14:32:33.211917285 +0000 UTC m=+0.114385079 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Oct 01 14:32:33 compute-0 nova_compute[192698]: 2025-10-01 14:32:33.338 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:32:33 compute-0 nova_compute[192698]: 2025-10-01 14:32:33.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:33 compute-0 nova_compute[192698]: 2025-10-01 14:32:33.856 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:32:33 compute-0 nova_compute[192698]: 2025-10-01 14:32:33.856 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.133s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:34 compute-0 nova_compute[192698]: 2025-10-01 14:32:34.856 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:32:34 compute-0 nova_compute[192698]: 2025-10-01 14:32:34.856 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:32:34 compute-0 nova_compute[192698]: 2025-10-01 14:32:34.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:32:34 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:34.969 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:d1:eb 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-555191a0-aa04-49d4-af46-93b0ff584e2d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c0ce6811a65d40628bfc69d5eb9bcf01', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3857fb1d-1e54-4a99-a828-506b6bdd5885, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0a30025f-995f-4cd9-a2bb-85c8c480de92) old=Port_Binding(mac=['fa:16:3e:f5:d1:eb'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-555191a0-aa04-49d4-af46-93b0ff584e2d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c0ce6811a65d40628bfc69d5eb9bcf01', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:32:34 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:34.970 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0a30025f-995f-4cd9-a2bb-85c8c480de92 in datapath 555191a0-aa04-49d4-af46-93b0ff584e2d updated
Oct 01 14:32:34 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:34.971 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 555191a0-aa04-49d4-af46-93b0ff584e2d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:32:34 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:34.973 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[58dba68e-410c-4b3d-8b7e-8934e29cb0f7]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:35 compute-0 nova_compute[192698]: 2025-10-01 14:32:35.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:36 compute-0 unix_chkpwd[227903]: password check failed for user (root)
Oct 01 14:32:36 compute-0 sshd-session[227857]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:32:38 compute-0 sshd-session[227857]: Failed password for root from 101.47.181.100 port 53360 ssh2
Oct 01 14:32:38 compute-0 nova_compute[192698]: 2025-10-01 14:32:38.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:38 compute-0 nova_compute[192698]: 2025-10-01 14:32:38.913 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:32:39 compute-0 sshd-session[227857]: Connection closed by authenticating user root 101.47.181.100 port 53360 [preauth]
Oct 01 14:32:39 compute-0 nova_compute[192698]: 2025-10-01 14:32:39.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:32:39 compute-0 nova_compute[192698]: 2025-10-01 14:32:39.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:32:40 compute-0 nova_compute[192698]: 2025-10-01 14:32:40.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:40 compute-0 nova_compute[192698]: 2025-10-01 14:32:40.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:32:41 compute-0 podman[227906]: 2025-10-01 14:32:41.179564326 +0000 UTC m=+0.083161679 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, managed_by=edpm_ansible, architecture=x86_64, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter)
Oct 01 14:32:41 compute-0 unix_chkpwd[227927]: password check failed for user (root)
Oct 01 14:32:41 compute-0 sshd-session[227904]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:32:43 compute-0 sshd-session[227904]: Failed password for root from 101.47.181.100 port 52836 ssh2
Oct 01 14:32:43 compute-0 nova_compute[192698]: 2025-10-01 14:32:43.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:44 compute-0 sshd-session[227904]: Connection closed by authenticating user root 101.47.181.100 port 52836 [preauth]
Oct 01 14:32:44 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:44.713 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f6:26:2f 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-fb18d4bb-2d35-46f3-a890-dde7b84f4d3c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb18d4bb-2d35-46f3-a890-dde7b84f4d3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '212993276c39412c938b179b82d692f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18865f90-580a-4e72-a6f4-e733eed35cec, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b4db1400-e10a-49f9-b365-95bb6bbc3623) old=Port_Binding(mac=['fa:16:3e:f6:26:2f'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-fb18d4bb-2d35-46f3-a890-dde7b84f4d3c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fb18d4bb-2d35-46f3-a890-dde7b84f4d3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '212993276c39412c938b179b82d692f2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:32:44 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:44.715 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b4db1400-e10a-49f9-b365-95bb6bbc3623 in datapath fb18d4bb-2d35-46f3-a890-dde7b84f4d3c updated
Oct 01 14:32:44 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:44.716 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fb18d4bb-2d35-46f3-a890-dde7b84f4d3c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:32:44 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:44.717 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d2a97262-539e-4cbb-9f79-534927e99edc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:32:45 compute-0 nova_compute[192698]: 2025-10-01 14:32:45.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:46 compute-0 podman[227931]: 2025-10-01 14:32:46.159604381 +0000 UTC m=+0.081821702 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:32:46 compute-0 podman[227932]: 2025-10-01 14:32:46.166736163 +0000 UTC m=+0.079661164 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 01 14:32:46 compute-0 unix_chkpwd[227972]: password check failed for user (root)
Oct 01 14:32:46 compute-0 sshd-session[227929]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:32:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:48.298 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:32:48 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:48.299 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:32:48 compute-0 nova_compute[192698]: 2025-10-01 14:32:48.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:48 compute-0 nova_compute[192698]: 2025-10-01 14:32:48.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:48 compute-0 sshd-session[227929]: Failed password for root from 101.47.181.100 port 52848 ssh2
Oct 01 14:32:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:32:49.301 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:32:49 compute-0 sshd-session[227929]: Connection closed by authenticating user root 101.47.181.100 port 52848 [preauth]
Oct 01 14:32:50 compute-0 nova_compute[192698]: 2025-10-01 14:32:50.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:52 compute-0 podman[227976]: 2025-10-01 14:32:52.146029392 +0000 UTC m=+0.066340746 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:32:53 compute-0 nova_compute[192698]: 2025-10-01 14:32:53.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:55 compute-0 nova_compute[192698]: 2025-10-01 14:32:55.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:58 compute-0 ovn_controller[94909]: 2025-10-01T14:32:58Z|00248|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 01 14:32:58 compute-0 unix_chkpwd[228000]: password check failed for user (root)
Oct 01 14:32:58 compute-0 sshd-session[227974]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:32:58 compute-0 nova_compute[192698]: 2025-10-01 14:32:58.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:32:58 compute-0 nova_compute[192698]: 2025-10-01 14:32:58.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:32:58 compute-0 nova_compute[192698]: 2025-10-01 14:32:58.927 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:58 compute-0 nova_compute[192698]: 2025-10-01 14:32:58.928 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:58 compute-0 nova_compute[192698]: 2025-10-01 14:32:58.928 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:58 compute-0 nova_compute[192698]: 2025-10-01 14:32:58.928 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:32:58 compute-0 nova_compute[192698]: 2025-10-01 14:32:58.929 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:32:58 compute-0 nova_compute[192698]: 2025-10-01 14:32:58.929 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:32:59 compute-0 podman[203144]: time="2025-10-01T14:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:32:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:32:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3030 "" "Go-http-client/1.1"
Oct 01 14:32:59 compute-0 nova_compute[192698]: 2025-10-01 14:32:59.944 2 DEBUG nova.virt.libvirt.imagecache [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:314
Oct 01 14:32:59 compute-0 nova_compute[192698]: 2025-10-01 14:32:59.945 2 WARNING nova.virt.libvirt.imagecache [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Unknown base file: /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546
Oct 01 14:32:59 compute-0 nova_compute[192698]: 2025-10-01 14:32:59.946 2 INFO nova.virt.libvirt.imagecache [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Removable base files: /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546
Oct 01 14:32:59 compute-0 nova_compute[192698]: 2025-10-01 14:32:59.947 2 INFO nova.virt.libvirt.imagecache [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546
Oct 01 14:32:59 compute-0 nova_compute[192698]: 2025-10-01 14:32:59.947 2 DEBUG nova.virt.libvirt.imagecache [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:350
Oct 01 14:32:59 compute-0 nova_compute[192698]: 2025-10-01 14:32:59.948 2 DEBUG nova.virt.libvirt.imagecache [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:299
Oct 01 14:32:59 compute-0 nova_compute[192698]: 2025-10-01 14:32:59.948 2 DEBUG nova.virt.libvirt.imagecache [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:284
Oct 01 14:33:00 compute-0 nova_compute[192698]: 2025-10-01 14:33:00.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:00 compute-0 sshd-session[227974]: Failed password for root from 101.47.181.100 port 35370 ssh2
Oct 01 14:33:01 compute-0 sshd-session[227974]: Connection closed by authenticating user root 101.47.181.100 port 35370 [preauth]
Oct 01 14:33:01 compute-0 openstack_network_exporter[205307]: ERROR   14:33:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:33:01 compute-0 openstack_network_exporter[205307]: ERROR   14:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:33:01 compute-0 openstack_network_exporter[205307]: ERROR   14:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:33:01 compute-0 openstack_network_exporter[205307]: ERROR   14:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:33:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:33:01 compute-0 openstack_network_exporter[205307]: ERROR   14:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:33:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:33:03 compute-0 nova_compute[192698]: 2025-10-01 14:33:03.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:04 compute-0 podman[228003]: 2025-10-01 14:33:04.180481178 +0000 UTC m=+0.091701478 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 01 14:33:04 compute-0 podman[228004]: 2025-10-01 14:33:04.223063044 +0000 UTC m=+0.126707530 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:33:04 compute-0 unix_chkpwd[228048]: password check failed for user (root)
Oct 01 14:33:04 compute-0 sshd-session[228001]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:33:05 compute-0 nova_compute[192698]: 2025-10-01 14:33:05.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:05 compute-0 sshd-session[228001]: Failed password for root from 101.47.181.100 port 58720 ssh2
Oct 01 14:33:08 compute-0 nova_compute[192698]: 2025-10-01 14:33:08.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:10 compute-0 nova_compute[192698]: 2025-10-01 14:33:10.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:11 compute-0 unix_chkpwd[228051]: password check failed for user (root)
Oct 01 14:33:11 compute-0 sshd-session[228049]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:33:12 compute-0 podman[228052]: 2025-10-01 14:33:12.153004953 +0000 UTC m=+0.062977606 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, config_id=edpm, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Oct 01 14:33:13 compute-0 sshd-session[228049]: Failed password for root from 101.47.181.100 port 58724 ssh2
Oct 01 14:33:13 compute-0 nova_compute[192698]: 2025-10-01 14:33:13.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:33:14.300 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:33:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:33:14.300 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:33:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:33:14.300 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:33:15 compute-0 nova_compute[192698]: 2025-10-01 14:33:15.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:16 compute-0 sshd-session[228049]: Connection closed by authenticating user root 101.47.181.100 port 58724 [preauth]
Oct 01 14:33:17 compute-0 podman[228076]: 2025-10-01 14:33:17.170087647 +0000 UTC m=+0.076731757 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:33:17 compute-0 podman[228077]: 2025-10-01 14:33:17.170248031 +0000 UTC m=+0.074066165 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 01 14:33:18 compute-0 nova_compute[192698]: 2025-10-01 14:33:18.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:20 compute-0 nova_compute[192698]: 2025-10-01 14:33:20.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:20 compute-0 unix_chkpwd[228113]: password check failed for user (root)
Oct 01 14:33:20 compute-0 sshd-session[228074]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:33:22 compute-0 sshd-session[228074]: Failed password for root from 101.47.181.100 port 46724 ssh2
Oct 01 14:33:23 compute-0 podman[228114]: 2025-10-01 14:33:23.167201206 +0000 UTC m=+0.077760574 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:33:23 compute-0 sshd-session[228074]: Connection closed by authenticating user root 101.47.181.100 port 46724 [preauth]
Oct 01 14:33:23 compute-0 nova_compute[192698]: 2025-10-01 14:33:23.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:25 compute-0 nova_compute[192698]: 2025-10-01 14:33:25.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:28 compute-0 nova_compute[192698]: 2025-10-01 14:33:28.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:29 compute-0 podman[203144]: time="2025-10-01T14:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:33:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:33:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3030 "" "Go-http-client/1.1"
Oct 01 14:33:30 compute-0 nova_compute[192698]: 2025-10-01 14:33:30.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:31 compute-0 openstack_network_exporter[205307]: ERROR   14:33:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:33:31 compute-0 openstack_network_exporter[205307]: ERROR   14:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:33:31 compute-0 openstack_network_exporter[205307]: ERROR   14:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:33:31 compute-0 openstack_network_exporter[205307]: ERROR   14:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:33:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:33:31 compute-0 openstack_network_exporter[205307]: ERROR   14:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:33:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:33:32 compute-0 nova_compute[192698]: 2025-10-01 14:33:32.946 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:33:32 compute-0 nova_compute[192698]: 2025-10-01 14:33:32.947 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:33:32 compute-0 nova_compute[192698]: 2025-10-01 14:33:32.947 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:33:33 compute-0 nova_compute[192698]: 2025-10-01 14:33:33.509 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:33:33 compute-0 nova_compute[192698]: 2025-10-01 14:33:33.510 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:33:33 compute-0 nova_compute[192698]: 2025-10-01 14:33:33.511 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:33:33 compute-0 nova_compute[192698]: 2025-10-01 14:33:33.511 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:33:33 compute-0 nova_compute[192698]: 2025-10-01 14:33:33.701 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:33:33 compute-0 nova_compute[192698]: 2025-10-01 14:33:33.703 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:33:33 compute-0 nova_compute[192698]: 2025-10-01 14:33:33.748 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:33:33 compute-0 nova_compute[192698]: 2025-10-01 14:33:33.750 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5815MB free_disk=73.3020133972168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:33:33 compute-0 nova_compute[192698]: 2025-10-01 14:33:33.750 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:33:33 compute-0 nova_compute[192698]: 2025-10-01 14:33:33.751 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:33:33 compute-0 nova_compute[192698]: 2025-10-01 14:33:33.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:34 compute-0 unix_chkpwd[228143]: password check failed for user (root)
Oct 01 14:33:34 compute-0 sshd-session[228140]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:33:34 compute-0 nova_compute[192698]: 2025-10-01 14:33:34.801 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:33:34 compute-0 nova_compute[192698]: 2025-10-01 14:33:34.802 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:33:33 up  1:32,  0 user,  load average: 0.05, 0.14, 0.23\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:33:34 compute-0 nova_compute[192698]: 2025-10-01 14:33:34.830 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing inventories for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 01 14:33:34 compute-0 nova_compute[192698]: 2025-10-01 14:33:34.850 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating ProviderTree inventory for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 01 14:33:34 compute-0 nova_compute[192698]: 2025-10-01 14:33:34.850 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 14:33:34 compute-0 nova_compute[192698]: 2025-10-01 14:33:34.860 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing aggregate associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 01 14:33:34 compute-0 nova_compute[192698]: 2025-10-01 14:33:34.896 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing trait associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, traits: COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 01 14:33:34 compute-0 nova_compute[192698]: 2025-10-01 14:33:34.914 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:33:35 compute-0 nova_compute[192698]: 2025-10-01 14:33:35.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:35 compute-0 podman[228144]: 2025-10-01 14:33:35.152228058 +0000 UTC m=+0.061953618 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:33:35 compute-0 podman[228145]: 2025-10-01 14:33:35.222813648 +0000 UTC m=+0.128832109 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:33:35 compute-0 nova_compute[192698]: 2025-10-01 14:33:35.422 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:33:35 compute-0 nova_compute[192698]: 2025-10-01 14:33:35.932 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:33:35 compute-0 nova_compute[192698]: 2025-10-01 14:33:35.933 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.182s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:33:35 compute-0 nova_compute[192698]: 2025-10-01 14:33:35.933 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:33:35 compute-0 nova_compute[192698]: 2025-10-01 14:33:35.933 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 01 14:33:36 compute-0 sshd-session[228140]: Failed password for root from 101.47.181.100 port 33328 ssh2
Oct 01 14:33:37 compute-0 nova_compute[192698]: 2025-10-01 14:33:37.418 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:33:37 compute-0 nova_compute[192698]: 2025-10-01 14:33:37.419 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:33:37 compute-0 sshd-session[228140]: Connection closed by authenticating user root 101.47.181.100 port 33328 [preauth]
Oct 01 14:33:38 compute-0 nova_compute[192698]: 2025-10-01 14:33:38.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:40 compute-0 nova_compute[192698]: 2025-10-01 14:33:40.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:40 compute-0 unix_chkpwd[228192]: password check failed for user (root)
Oct 01 14:33:40 compute-0 sshd-session[228190]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:33:40 compute-0 nova_compute[192698]: 2025-10-01 14:33:40.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:33:40 compute-0 nova_compute[192698]: 2025-10-01 14:33:40.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:33:41 compute-0 nova_compute[192698]: 2025-10-01 14:33:41.913 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:33:41 compute-0 sshd-session[228190]: Failed password for root from 101.47.181.100 port 50218 ssh2
Oct 01 14:33:42 compute-0 nova_compute[192698]: 2025-10-01 14:33:42.576 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:33:42 compute-0 nova_compute[192698]: 2025-10-01 14:33:42.576 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:33:43 compute-0 podman[228193]: 2025-10-01 14:33:43.167504129 +0000 UTC m=+0.074001173 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 01 14:33:43 compute-0 sshd-session[228190]: Connection closed by authenticating user root 101.47.181.100 port 50218 [preauth]
Oct 01 14:33:43 compute-0 nova_compute[192698]: 2025-10-01 14:33:43.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:43 compute-0 nova_compute[192698]: 2025-10-01 14:33:43.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:33:45 compute-0 nova_compute[192698]: 2025-10-01 14:33:45.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:47 compute-0 nova_compute[192698]: 2025-10-01 14:33:47.778 2 DEBUG nova.virt.libvirt.driver [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Creating tmpfile /var/lib/nova/instances/tmp54a07ioa to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 01 14:33:47 compute-0 nova_compute[192698]: 2025-10-01 14:33:47.779 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:33:47 compute-0 nova_compute[192698]: 2025-10-01 14:33:47.784 2 DEBUG nova.compute.manager [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp54a07ioa',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 01 14:33:48 compute-0 podman[228217]: 2025-10-01 14:33:48.161270904 +0000 UTC m=+0.073780337 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 14:33:48 compute-0 podman[228218]: 2025-10-01 14:33:48.161470719 +0000 UTC m=+0.068778302 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd)
Oct 01 14:33:48 compute-0 nova_compute[192698]: 2025-10-01 14:33:48.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:49 compute-0 nova_compute[192698]: 2025-10-01 14:33:49.826 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:33:50 compute-0 nova_compute[192698]: 2025-10-01 14:33:50.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:50 compute-0 unix_chkpwd[228253]: password check failed for user (root)
Oct 01 14:33:50 compute-0 sshd-session[228215]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:33:53 compute-0 sshd-session[228215]: Failed password for root from 101.47.181.100 port 50222 ssh2
Oct 01 14:33:53 compute-0 nova_compute[192698]: 2025-10-01 14:33:53.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:54 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 01 14:33:54 compute-0 podman[228255]: 2025-10-01 14:33:54.134117321 +0000 UTC m=+0.073255933 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:33:54 compute-0 nova_compute[192698]: 2025-10-01 14:33:54.450 2 DEBUG nova.compute.manager [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp54a07ioa',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9dcd8b44-4daa-408b-a130-bf9c003d0750',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 01 14:33:55 compute-0 nova_compute[192698]: 2025-10-01 14:33:55.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:55 compute-0 nova_compute[192698]: 2025-10-01 14:33:55.432 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:33:55 compute-0 nova_compute[192698]: 2025-10-01 14:33:55.433 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 01 14:33:55 compute-0 nova_compute[192698]: 2025-10-01 14:33:55.467 2 DEBUG oslo_concurrency.lockutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-9dcd8b44-4daa-408b-a130-bf9c003d0750" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:33:55 compute-0 nova_compute[192698]: 2025-10-01 14:33:55.467 2 DEBUG oslo_concurrency.lockutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-9dcd8b44-4daa-408b-a130-bf9c003d0750" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:33:55 compute-0 nova_compute[192698]: 2025-10-01 14:33:55.468 2 DEBUG nova.network.neutron [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:33:55 compute-0 nova_compute[192698]: 2025-10-01 14:33:55.942 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 01 14:33:55 compute-0 nova_compute[192698]: 2025-10-01 14:33:55.974 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:33:56 compute-0 sshd-session[228215]: Connection closed by authenticating user root 101.47.181.100 port 50222 [preauth]
Oct 01 14:33:56 compute-0 nova_compute[192698]: 2025-10-01 14:33:56.902 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:33:57 compute-0 nova_compute[192698]: 2025-10-01 14:33:57.035 2 DEBUG nova.network.neutron [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Updating instance_info_cache with network_info: [{"id": "a53c1a86-fb01-426e-9566-47e7ed07a37a", "address": "fa:16:3e:01:c0:62", "network": {"id": "555191a0-aa04-49d4-af46-93b0ff584e2d", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-742538066-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ce6811a65d40628bfc69d5eb9bcf01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa53c1a86-fb", "ovs_interfaceid": "a53c1a86-fb01-426e-9566-47e7ed07a37a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:33:57 compute-0 nova_compute[192698]: 2025-10-01 14:33:57.542 2 DEBUG oslo_concurrency.lockutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-9dcd8b44-4daa-408b-a130-bf9c003d0750" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:33:57 compute-0 nova_compute[192698]: 2025-10-01 14:33:57.557 2 DEBUG nova.virt.libvirt.driver [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp54a07ioa',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9dcd8b44-4daa-408b-a130-bf9c003d0750',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 01 14:33:57 compute-0 nova_compute[192698]: 2025-10-01 14:33:57.558 2 DEBUG nova.virt.libvirt.driver [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Creating instance directory: /var/lib/nova/instances/9dcd8b44-4daa-408b-a130-bf9c003d0750 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 01 14:33:57 compute-0 nova_compute[192698]: 2025-10-01 14:33:57.558 2 DEBUG nova.virt.libvirt.driver [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Creating disk.info with the contents: {'/var/lib/nova/instances/9dcd8b44-4daa-408b-a130-bf9c003d0750/disk': 'qcow2', '/var/lib/nova/instances/9dcd8b44-4daa-408b-a130-bf9c003d0750/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 01 14:33:57 compute-0 nova_compute[192698]: 2025-10-01 14:33:57.559 2 DEBUG nova.virt.libvirt.driver [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 01 14:33:57 compute-0 nova_compute[192698]: 2025-10-01 14:33:57.559 2 DEBUG nova.objects.instance [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9dcd8b44-4daa-408b-a130-bf9c003d0750 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:33:58 compute-0 unix_chkpwd[228282]: password check failed for user (root)
Oct 01 14:33:58 compute-0 sshd-session[228280]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.066 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.072 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.075 2 DEBUG oslo_concurrency.processutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.166 2 DEBUG oslo_concurrency.processutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.168 2 DEBUG oslo_concurrency.lockutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.169 2 DEBUG oslo_concurrency.lockutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.169 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.172 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.173 2 DEBUG oslo_concurrency.processutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.245 2 DEBUG oslo_concurrency.processutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.246 2 DEBUG oslo_concurrency.processutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/9dcd8b44-4daa-408b-a130-bf9c003d0750/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.292 2 DEBUG oslo_concurrency.processutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/9dcd8b44-4daa-408b-a130-bf9c003d0750/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.293 2 DEBUG oslo_concurrency.lockutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.293 2 DEBUG oslo_concurrency.processutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.359 2 DEBUG oslo_concurrency.processutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.360 2 DEBUG nova.virt.disk.api [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/9dcd8b44-4daa-408b-a130-bf9c003d0750/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.360 2 DEBUG oslo_concurrency.processutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dcd8b44-4daa-408b-a130-bf9c003d0750/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.425 2 DEBUG oslo_concurrency.processutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9dcd8b44-4daa-408b-a130-bf9c003d0750/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.427 2 DEBUG nova.virt.disk.api [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/9dcd8b44-4daa-408b-a130-bf9c003d0750/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.428 2 DEBUG nova.objects.instance [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 9dcd8b44-4daa-408b-a130-bf9c003d0750 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.937 2 DEBUG nova.objects.base [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<9dcd8b44-4daa-408b-a130-bf9c003d0750> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.938 2 DEBUG oslo_concurrency.processutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9dcd8b44-4daa-408b-a130-bf9c003d0750/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.979 2 DEBUG oslo_concurrency.processutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9dcd8b44-4daa-408b-a130-bf9c003d0750/disk.config 497664" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.980 2 DEBUG nova.virt.libvirt.driver [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.981 2 DEBUG nova.virt.libvirt.vif [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-01T14:32:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-395021213',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-395021213',id=30,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:33:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='212993276c39412c938b179b82d692f2',ramdisk_id='',reservation_id='r-hr5wqkk0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model=
'virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1252378341',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1252378341-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:33:11Z,user_data=None,user_id='2c2679cca0d247d1828f85f7ce3bb197',uuid=9dcd8b44-4daa-408b-a130-bf9c003d0750,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a53c1a86-fb01-426e-9566-47e7ed07a37a", "address": "fa:16:3e:01:c0:62", "network": {"id": "555191a0-aa04-49d4-af46-93b0ff584e2d", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-742538066-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ce6811a65d40628bfc69d5eb9bcf01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa53c1a86-fb", "ovs_interfaceid": "a53c1a86-fb01-426e-9566-47e7ed07a37a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.982 2 DEBUG nova.network.os_vif_util [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "a53c1a86-fb01-426e-9566-47e7ed07a37a", "address": "fa:16:3e:01:c0:62", "network": {"id": "555191a0-aa04-49d4-af46-93b0ff584e2d", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-742538066-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ce6811a65d40628bfc69d5eb9bcf01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa53c1a86-fb", "ovs_interfaceid": "a53c1a86-fb01-426e-9566-47e7ed07a37a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.983 2 DEBUG nova.network.os_vif_util [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:c0:62,bridge_name='br-int',has_traffic_filtering=True,id=a53c1a86-fb01-426e-9566-47e7ed07a37a,network=Network(555191a0-aa04-49d4-af46-93b0ff584e2d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa53c1a86-fb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.983 2 DEBUG os_vif [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:c0:62,bridge_name='br-int',has_traffic_filtering=True,id=a53c1a86-fb01-426e-9566-47e7ed07a37a,network=Network(555191a0-aa04-49d4-af46-93b0ff584e2d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa53c1a86-fb') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.985 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.985 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.986 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '86baf53f-3f10-5f0e-be4b-7413d20c3258', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.992 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa53c1a86-fb, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.993 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa53c1a86-fb, col_values=(('qos', UUID('7f76d115-ff6f-42d6-97a4-eda25b209b9d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.993 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa53c1a86-fb, col_values=(('external_ids', {'iface-id': 'a53c1a86-fb01-426e-9566-47e7ed07a37a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:c0:62', 'vm-uuid': '9dcd8b44-4daa-408b-a130-bf9c003d0750'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:58 compute-0 NetworkManager[51741]: <info>  [1759329238.9965] manager: (tapa53c1a86-fb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Oct 01 14:33:58 compute-0 nova_compute[192698]: 2025-10-01 14:33:58.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:33:59 compute-0 nova_compute[192698]: 2025-10-01 14:33:59.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:33:59 compute-0 nova_compute[192698]: 2025-10-01 14:33:59.005 2 INFO os_vif [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:c0:62,bridge_name='br-int',has_traffic_filtering=True,id=a53c1a86-fb01-426e-9566-47e7ed07a37a,network=Network(555191a0-aa04-49d4-af46-93b0ff584e2d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa53c1a86-fb')
Oct 01 14:33:59 compute-0 nova_compute[192698]: 2025-10-01 14:33:59.006 2 DEBUG nova.virt.libvirt.driver [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 01 14:33:59 compute-0 nova_compute[192698]: 2025-10-01 14:33:59.006 2 DEBUG nova.compute.manager [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp54a07ioa',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9dcd8b44-4daa-408b-a130-bf9c003d0750',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 01 14:33:59 compute-0 nova_compute[192698]: 2025-10-01 14:33:59.007 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:33:59 compute-0 nova_compute[192698]: 2025-10-01 14:33:59.327 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:33:59 compute-0 podman[203144]: time="2025-10-01T14:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:33:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:33:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3030 "" "Go-http-client/1.1"
Oct 01 14:34:00 compute-0 nova_compute[192698]: 2025-10-01 14:34:00.031 2 DEBUG nova.network.neutron [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Port a53c1a86-fb01-426e-9566-47e7ed07a37a updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 01 14:34:00 compute-0 nova_compute[192698]: 2025-10-01 14:34:00.053 2 DEBUG nova.compute.manager [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp54a07ioa',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9dcd8b44-4daa-408b-a130-bf9c003d0750',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 01 14:34:00 compute-0 nova_compute[192698]: 2025-10-01 14:34:00.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:00 compute-0 sshd-session[228280]: Failed password for root from 101.47.181.100 port 37736 ssh2
Oct 01 14:34:00 compute-0 sshd-session[228280]: Connection closed by authenticating user root 101.47.181.100 port 37736 [preauth]
Oct 01 14:34:01 compute-0 openstack_network_exporter[205307]: ERROR   14:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:34:01 compute-0 openstack_network_exporter[205307]: ERROR   14:34:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:34:01 compute-0 openstack_network_exporter[205307]: ERROR   14:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:34:01 compute-0 openstack_network_exporter[205307]: ERROR   14:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:34:01 compute-0 openstack_network_exporter[205307]: ERROR   14:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:34:03 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 01 14:34:03 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 01 14:34:03 compute-0 kernel: tapa53c1a86-fb: entered promiscuous mode
Oct 01 14:34:03 compute-0 ovn_controller[94909]: 2025-10-01T14:34:03Z|00249|binding|INFO|Claiming lport a53c1a86-fb01-426e-9566-47e7ed07a37a for this additional chassis.
Oct 01 14:34:03 compute-0 ovn_controller[94909]: 2025-10-01T14:34:03Z|00250|binding|INFO|a53c1a86-fb01-426e-9566-47e7ed07a37a: Claiming fa:16:3e:01:c0:62 10.100.0.9
Oct 01 14:34:03 compute-0 nova_compute[192698]: 2025-10-01 14:34:03.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:03 compute-0 NetworkManager[51741]: <info>  [1759329243.4955] manager: (tapa53c1a86-fb): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Oct 01 14:34:03 compute-0 nova_compute[192698]: 2025-10-01 14:34:03.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.510 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:c0:62 10.100.0.9'], port_security=['fa:16:3e:01:c0:62 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9dcd8b44-4daa-408b-a130-bf9c003d0750', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-555191a0-aa04-49d4-af46-93b0ff584e2d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '212993276c39412c938b179b82d692f2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '134d03e5-3921-4f89-b7a4-e938969031c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3857fb1d-1e54-4a99-a828-506b6bdd5885, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=a53c1a86-fb01-426e-9566-47e7ed07a37a) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.511 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a53c1a86-fb01-426e-9566-47e7ed07a37a in datapath 555191a0-aa04-49d4-af46-93b0ff584e2d unbound from our chassis
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.512 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 555191a0-aa04-49d4-af46-93b0ff584e2d
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.538 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f3e9dcd3-1f1e-465c-9cba-8755b26d2a08]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.539 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap555191a0-a1 in ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.542 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap555191a0-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.542 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[2f0d67e5-9305-4c46-8c6e-22a1ccffe89d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.543 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[166145d0-bf99-4108-a86d-8466e7f49bd9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 systemd-machined[152704]: New machine qemu-24-instance-0000001e.
Oct 01 14:34:03 compute-0 systemd-udevd[228339]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.564 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[df52f359-600c-4c8f-9c47-1c5834c4acef]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 nova_compute[192698]: 2025-10-01 14:34:03.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:03 compute-0 ovn_controller[94909]: 2025-10-01T14:34:03Z|00251|binding|INFO|Setting lport a53c1a86-fb01-426e-9566-47e7ed07a37a ovn-installed in OVS
Oct 01 14:34:03 compute-0 nova_compute[192698]: 2025-10-01 14:34:03.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:03 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-0000001e.
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.587 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[0c3eae66-11b0-4366-8fb2-7c74f8d7a782]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 NetworkManager[51741]: <info>  [1759329243.5888] device (tapa53c1a86-fb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:34:03 compute-0 NetworkManager[51741]: <info>  [1759329243.5903] device (tapa53c1a86-fb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.637 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[1fbb8dea-5b4e-4b70-8994-9b0f26aa443b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.644 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1f979ef5-be6b-484c-a694-d32616783725]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 NetworkManager[51741]: <info>  [1759329243.6470] manager: (tap555191a0-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/94)
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.686 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[61af80e2-7651-45f2-8fb8-2b494a01b3b1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.690 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a7475416-cea9-4ddd-a5c5-645e0c1122c5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 NetworkManager[51741]: <info>  [1759329243.7205] device (tap555191a0-a0): carrier: link connected
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.726 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[ff9344cd-8211-4ee0-b6de-c8d2e9cf0e28]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.750 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[291cf6ee-077d-4668-a432-23849eaa71fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap555191a0-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:d1:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560425, 'reachable_time': 17058, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228370, 'error': None, 'target': 'ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.772 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e6cdc203-754c-4c37-ad67-db3df0a8efdf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef5:d1eb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 560425, 'tstamp': 560425}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228371, 'error': None, 'target': 'ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.792 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7a27c750-e08c-4381-a596-c4141fb7031f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap555191a0-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:d1:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560425, 'reachable_time': 17058, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228373, 'error': None, 'target': 'ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.834 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[99a3e4a1-f231-443e-a834-c5566f6cada5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.923 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c3e8354e-9a9e-4de2-a17e-8c43d3c715e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.925 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap555191a0-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.925 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.925 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap555191a0-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:34:03 compute-0 nova_compute[192698]: 2025-10-01 14:34:03.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:03 compute-0 NetworkManager[51741]: <info>  [1759329243.9290] manager: (tap555191a0-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Oct 01 14:34:03 compute-0 kernel: tap555191a0-a0: entered promiscuous mode
Oct 01 14:34:03 compute-0 nova_compute[192698]: 2025-10-01 14:34:03.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.933 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap555191a0-a0, col_values=(('external_ids', {'iface-id': '0a30025f-995f-4cd9-a2bb-85c8c480de92'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:34:03 compute-0 nova_compute[192698]: 2025-10-01 14:34:03.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:03 compute-0 ovn_controller[94909]: 2025-10-01T14:34:03Z|00252|binding|INFO|Releasing lport 0a30025f-995f-4cd9-a2bb-85c8c480de92 from this chassis (sb_readonly=0)
Oct 01 14:34:03 compute-0 nova_compute[192698]: 2025-10-01 14:34:03.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.938 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc09c6d-c039-45c8-8c89-eee4944fd2d2]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.939 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.939 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.939 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 555191a0-aa04-49d4-af46-93b0ff584e2d disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.940 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.940 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[22f89564-8e55-4fc3-a369-d29d6fb9f6f9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.941 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.941 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[2294448e-eb00-47e6-8058-77312d7cb4da]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.942 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-555191a0-aa04-49d4-af46-93b0ff584e2d
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID 555191a0-aa04-49d4-af46-93b0ff584e2d
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:34:03 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:03.943 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d', 'env', 'PROCESS_TAG=haproxy-555191a0-aa04-49d4-af46-93b0ff584e2d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/555191a0-aa04-49d4-af46-93b0ff584e2d.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:34:03 compute-0 nova_compute[192698]: 2025-10-01 14:34:03.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:03 compute-0 nova_compute[192698]: 2025-10-01 14:34:03.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:04 compute-0 podman[228411]: 2025-10-01 14:34:04.398970834 +0000 UTC m=+0.061280610 container create 803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 01 14:34:04 compute-0 systemd[1]: Started libpod-conmon-803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94.scope.
Oct 01 14:34:04 compute-0 podman[228411]: 2025-10-01 14:34:04.367528558 +0000 UTC m=+0.029838374 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:34:04 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:34:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d14013434ab76c7f7d500e99bd4b313e2f8d81f40eafc0daffff2c709867f88b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:34:04 compute-0 podman[228411]: 2025-10-01 14:34:04.501643387 +0000 UTC m=+0.163953173 container init 803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930)
Oct 01 14:34:04 compute-0 podman[228411]: 2025-10-01 14:34:04.512697255 +0000 UTC m=+0.175007061 container start 803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 14:34:04 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[228426]: [NOTICE]   (228430) : New worker (228432) forked
Oct 01 14:34:04 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[228426]: [NOTICE]   (228430) : Loading success.
Oct 01 14:34:05 compute-0 nova_compute[192698]: 2025-10-01 14:34:05.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:05 compute-0 unix_chkpwd[228453]: password check failed for user (root)
Oct 01 14:34:05 compute-0 sshd-session[228303]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:34:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:05.990 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:34:05 compute-0 nova_compute[192698]: 2025-10-01 14:34:05.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:05.992 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:34:06 compute-0 podman[228455]: 2025-10-01 14:34:06.181464562 +0000 UTC m=+0.079475350 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 01 14:34:06 compute-0 ovn_controller[94909]: 2025-10-01T14:34:06Z|00253|binding|INFO|Claiming lport a53c1a86-fb01-426e-9566-47e7ed07a37a for this chassis.
Oct 01 14:34:06 compute-0 ovn_controller[94909]: 2025-10-01T14:34:06Z|00254|binding|INFO|a53c1a86-fb01-426e-9566-47e7ed07a37a: Claiming fa:16:3e:01:c0:62 10.100.0.9
Oct 01 14:34:06 compute-0 ovn_controller[94909]: 2025-10-01T14:34:06Z|00255|binding|INFO|Setting lport a53c1a86-fb01-426e-9566-47e7ed07a37a up in Southbound
Oct 01 14:34:06 compute-0 podman[228456]: 2025-10-01 14:34:06.272811341 +0000 UTC m=+0.166367480 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 01 14:34:06 compute-0 sshd-session[228303]: Failed password for root from 101.47.181.100 port 33224 ssh2
Oct 01 14:34:07 compute-0 nova_compute[192698]: 2025-10-01 14:34:07.683 2 INFO nova.compute.manager [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Post operation of migration started
Oct 01 14:34:07 compute-0 nova_compute[192698]: 2025-10-01 14:34:07.684 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:34:08 compute-0 sshd-session[228303]: Connection closed by authenticating user root 101.47.181.100 port 33224 [preauth]
Oct 01 14:34:08 compute-0 nova_compute[192698]: 2025-10-01 14:34:08.628 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:34:08 compute-0 nova_compute[192698]: 2025-10-01 14:34:08.629 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:34:08 compute-0 nova_compute[192698]: 2025-10-01 14:34:08.748 2 DEBUG oslo_concurrency.lockutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-9dcd8b44-4daa-408b-a130-bf9c003d0750" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:34:08 compute-0 nova_compute[192698]: 2025-10-01 14:34:08.749 2 DEBUG oslo_concurrency.lockutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-9dcd8b44-4daa-408b-a130-bf9c003d0750" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:34:08 compute-0 nova_compute[192698]: 2025-10-01 14:34:08.749 2 DEBUG nova.network.neutron [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:34:08 compute-0 nova_compute[192698]: 2025-10-01 14:34:08.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:09 compute-0 nova_compute[192698]: 2025-10-01 14:34:09.257 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:34:09 compute-0 unix_chkpwd[228503]: password check failed for user (root)
Oct 01 14:34:09 compute-0 sshd-session[228501]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:34:09 compute-0 nova_compute[192698]: 2025-10-01 14:34:09.886 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:34:10 compute-0 nova_compute[192698]: 2025-10-01 14:34:10.075 2 DEBUG nova.network.neutron [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Updating instance_info_cache with network_info: [{"id": "a53c1a86-fb01-426e-9566-47e7ed07a37a", "address": "fa:16:3e:01:c0:62", "network": {"id": "555191a0-aa04-49d4-af46-93b0ff584e2d", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-742538066-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ce6811a65d40628bfc69d5eb9bcf01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa53c1a86-fb", "ovs_interfaceid": "a53c1a86-fb01-426e-9566-47e7ed07a37a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:34:10 compute-0 nova_compute[192698]: 2025-10-01 14:34:10.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:10 compute-0 nova_compute[192698]: 2025-10-01 14:34:10.582 2 DEBUG oslo_concurrency.lockutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-9dcd8b44-4daa-408b-a130-bf9c003d0750" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:34:11 compute-0 nova_compute[192698]: 2025-10-01 14:34:11.104 2 DEBUG oslo_concurrency.lockutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:34:11 compute-0 nova_compute[192698]: 2025-10-01 14:34:11.105 2 DEBUG oslo_concurrency.lockutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:34:11 compute-0 nova_compute[192698]: 2025-10-01 14:34:11.105 2 DEBUG oslo_concurrency.lockutils [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:34:11 compute-0 nova_compute[192698]: 2025-10-01 14:34:11.112 2 INFO nova.virt.libvirt.driver [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 01 14:34:11 compute-0 virtqemud[192597]: Domain id=24 name='instance-0000001e' uuid=9dcd8b44-4daa-408b-a130-bf9c003d0750 is tainted: custom-monitor
Oct 01 14:34:11 compute-0 sshd-session[228501]: Failed password for root from 101.47.181.100 port 49468 ssh2
Oct 01 14:34:12 compute-0 nova_compute[192698]: 2025-10-01 14:34:12.122 2 INFO nova.virt.libvirt.driver [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 01 14:34:12 compute-0 sshd-session[228501]: Connection closed by authenticating user root 101.47.181.100 port 49468 [preauth]
Oct 01 14:34:13 compute-0 nova_compute[192698]: 2025-10-01 14:34:13.132 2 INFO nova.virt.libvirt.driver [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 01 14:34:13 compute-0 nova_compute[192698]: 2025-10-01 14:34:13.139 2 DEBUG nova.compute.manager [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:34:13 compute-0 nova_compute[192698]: 2025-10-01 14:34:13.652 2 DEBUG nova.objects.instance [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:34:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:13.993 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:34:14 compute-0 nova_compute[192698]: 2025-10-01 14:34:13.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:14 compute-0 podman[228507]: 2025-10-01 14:34:14.166445836 +0000 UTC m=+0.080354434 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, version=9.6, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Oct 01 14:34:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:14.301 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:34:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:14.302 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:34:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:14.303 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:34:14 compute-0 nova_compute[192698]: 2025-10-01 14:34:14.676 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:34:15 compute-0 nova_compute[192698]: 2025-10-01 14:34:15.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:15 compute-0 nova_compute[192698]: 2025-10-01 14:34:15.351 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:34:15 compute-0 nova_compute[192698]: 2025-10-01 14:34:15.352 2 WARNING neutronclient.v2_0.client [None req-4c0a9d61-a62b-4bfd-95e9-15150fcd760f a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:34:17 compute-0 unix_chkpwd[228530]: password check failed for user (root)
Oct 01 14:34:17 compute-0 sshd-session[228504]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:34:18 compute-0 sshd-session[228504]: Failed password for root from 101.47.181.100 port 49484 ssh2
Oct 01 14:34:19 compute-0 nova_compute[192698]: 2025-10-01 14:34:19.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:19 compute-0 podman[228531]: 2025-10-01 14:34:19.161233616 +0000 UTC m=+0.069870911 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid)
Oct 01 14:34:19 compute-0 podman[228532]: 2025-10-01 14:34:19.174067352 +0000 UTC m=+0.076896571 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct 01 14:34:20 compute-0 sshd-session[228504]: Connection closed by authenticating user root 101.47.181.100 port 49484 [preauth]
Oct 01 14:34:20 compute-0 nova_compute[192698]: 2025-10-01 14:34:20.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:24 compute-0 nova_compute[192698]: 2025-10-01 14:34:24.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:24 compute-0 unix_chkpwd[228572]: password check failed for user (root)
Oct 01 14:34:24 compute-0 sshd-session[228570]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:34:24 compute-0 nova_compute[192698]: 2025-10-01 14:34:24.640 2 DEBUG oslo_concurrency.lockutils [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Acquiring lock "9dcd8b44-4daa-408b-a130-bf9c003d0750" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:34:24 compute-0 nova_compute[192698]: 2025-10-01 14:34:24.641 2 DEBUG oslo_concurrency.lockutils [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lock "9dcd8b44-4daa-408b-a130-bf9c003d0750" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:34:24 compute-0 nova_compute[192698]: 2025-10-01 14:34:24.641 2 DEBUG oslo_concurrency.lockutils [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Acquiring lock "9dcd8b44-4daa-408b-a130-bf9c003d0750-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:34:24 compute-0 nova_compute[192698]: 2025-10-01 14:34:24.641 2 DEBUG oslo_concurrency.lockutils [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lock "9dcd8b44-4daa-408b-a130-bf9c003d0750-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:34:24 compute-0 nova_compute[192698]: 2025-10-01 14:34:24.642 2 DEBUG oslo_concurrency.lockutils [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lock "9dcd8b44-4daa-408b-a130-bf9c003d0750-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:34:24 compute-0 nova_compute[192698]: 2025-10-01 14:34:24.655 2 INFO nova.compute.manager [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Terminating instance
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.177 2 DEBUG nova.compute.manager [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:34:25 compute-0 kernel: tapa53c1a86-fb (unregistering): left promiscuous mode
Oct 01 14:34:25 compute-0 NetworkManager[51741]: <info>  [1759329265.2008] device (tapa53c1a86-fb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:34:25 compute-0 podman[228573]: 2025-10-01 14:34:25.212600968 +0000 UTC m=+0.116692352 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:34:25 compute-0 ovn_controller[94909]: 2025-10-01T14:34:25Z|00256|binding|INFO|Releasing lport a53c1a86-fb01-426e-9566-47e7ed07a37a from this chassis (sb_readonly=0)
Oct 01 14:34:25 compute-0 ovn_controller[94909]: 2025-10-01T14:34:25Z|00257|binding|INFO|Setting lport a53c1a86-fb01-426e-9566-47e7ed07a37a down in Southbound
Oct 01 14:34:25 compute-0 ovn_controller[94909]: 2025-10-01T14:34:25Z|00258|binding|INFO|Removing iface tapa53c1a86-fb ovn-installed in OVS
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.223 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:c0:62 10.100.0.9'], port_security=['fa:16:3e:01:c0:62 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9dcd8b44-4daa-408b-a130-bf9c003d0750', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-555191a0-aa04-49d4-af46-93b0ff584e2d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '212993276c39412c938b179b82d692f2', 'neutron:revision_number': '15', 'neutron:security_group_ids': '134d03e5-3921-4f89-b7a4-e938969031c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3857fb1d-1e54-4a99-a828-506b6bdd5885, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=a53c1a86-fb01-426e-9566-47e7ed07a37a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.224 103791 INFO neutron.agent.ovn.metadata.agent [-] Port a53c1a86-fb01-426e-9566-47e7ed07a37a in datapath 555191a0-aa04-49d4-af46-93b0ff584e2d unbound from our chassis
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.224 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 555191a0-aa04-49d4-af46-93b0ff584e2d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.225 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c72c86f1-4005-476a-ad8a-3425232d4c21]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.227 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d namespace which is not needed anymore
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:25 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Oct 01 14:34:25 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001e.scope: Consumed 2.424s CPU time.
Oct 01 14:34:25 compute-0 systemd-machined[152704]: Machine qemu-24-instance-0000001e terminated.
Oct 01 14:34:25 compute-0 podman[228622]: 2025-10-01 14:34:25.367802075 +0000 UTC m=+0.034031927 container kill 803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Oct 01 14:34:25 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[228426]: [NOTICE]   (228430) : haproxy version is 3.0.5-8e879a5
Oct 01 14:34:25 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[228426]: [NOTICE]   (228430) : path to executable is /usr/sbin/haproxy
Oct 01 14:34:25 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[228426]: [WARNING]  (228430) : Exiting Master process...
Oct 01 14:34:25 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[228426]: [ALERT]    (228430) : Current worker (228432) exited with code 143 (Terminated)
Oct 01 14:34:25 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[228426]: [WARNING]  (228430) : All workers exited. Exiting... (0)
Oct 01 14:34:25 compute-0 systemd[1]: libpod-803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94.scope: Deactivated successfully.
Oct 01 14:34:25 compute-0 podman[228635]: 2025-10-01 14:34:25.423934126 +0000 UTC m=+0.032496986 container died 803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930)
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.446 2 DEBUG nova.compute.manager [req-ea8fb25d-6814-45ba-b782-b270415edbb0 req-d1a9cda5-fae4-498e-b8e9-71b696e074fb 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Received event network-vif-unplugged-a53c1a86-fb01-426e-9566-47e7ed07a37a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.447 2 DEBUG oslo_concurrency.lockutils [req-ea8fb25d-6814-45ba-b782-b270415edbb0 req-d1a9cda5-fae4-498e-b8e9-71b696e074fb 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "9dcd8b44-4daa-408b-a130-bf9c003d0750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.447 2 DEBUG oslo_concurrency.lockutils [req-ea8fb25d-6814-45ba-b782-b270415edbb0 req-d1a9cda5-fae4-498e-b8e9-71b696e074fb 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9dcd8b44-4daa-408b-a130-bf9c003d0750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.448 2 DEBUG oslo_concurrency.lockutils [req-ea8fb25d-6814-45ba-b782-b270415edbb0 req-d1a9cda5-fae4-498e-b8e9-71b696e074fb 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9dcd8b44-4daa-408b-a130-bf9c003d0750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.448 2 DEBUG nova.compute.manager [req-ea8fb25d-6814-45ba-b782-b270415edbb0 req-d1a9cda5-fae4-498e-b8e9-71b696e074fb 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] No waiting events found dispatching network-vif-unplugged-a53c1a86-fb01-426e-9566-47e7ed07a37a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.448 2 DEBUG nova.compute.manager [req-ea8fb25d-6814-45ba-b782-b270415edbb0 req-d1a9cda5-fae4-498e-b8e9-71b696e074fb 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Received event network-vif-unplugged-a53c1a86-fb01-426e-9566-47e7ed07a37a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.456 2 INFO nova.virt.libvirt.driver [-] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Instance destroyed successfully.
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.457 2 DEBUG nova.objects.instance [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lazy-loading 'resources' on Instance uuid 9dcd8b44-4daa-408b-a130-bf9c003d0750 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:34:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94-userdata-shm.mount: Deactivated successfully.
Oct 01 14:34:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-d14013434ab76c7f7d500e99bd4b313e2f8d81f40eafc0daffff2c709867f88b-merged.mount: Deactivated successfully.
Oct 01 14:34:25 compute-0 podman[228635]: 2025-10-01 14:34:25.47385918 +0000 UTC m=+0.082421990 container cleanup 803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 01 14:34:25 compute-0 systemd[1]: libpod-conmon-803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94.scope: Deactivated successfully.
Oct 01 14:34:25 compute-0 podman[228637]: 2025-10-01 14:34:25.496705285 +0000 UTC m=+0.094845014 container remove 803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.502 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[88c1e129-8755-409a-a2c0-90468bfa0ea9]: (4, ("Wed Oct  1 02:34:25 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d (803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94)\n803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94\nWed Oct  1 02:34:25 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d (803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94)\n803de89d4b83195c12e241354e33c388ced39edd95f399ba99994a672726df94\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.504 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1ade2cf5-773c-47f8-b6f3-8712c70cd36c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.505 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.505 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[b874026c-ca23-45d7-a288-25781fd714f8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.506 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap555191a0-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:25 compute-0 kernel: tap555191a0-a0: left promiscuous mode
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.536 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[00cc4b74-05e0-4d92-8dbb-e8b359b248c3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.574 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[e6062844-01e0-4d14-a46f-cee3226c4123]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.576 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[0a6ef606-a76c-4fcd-acff-8e20514bc554]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.592 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[9fdf11f8-238b-412e-bcbe-c3b58d585e46]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 560415, 'reachable_time': 18371, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228686, 'error': None, 'target': 'ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.595 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:34:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:25.595 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[b3c80027-8c74-4341-8492-184d7d3c0e2f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:34:25 compute-0 systemd[1]: run-netns-ovnmeta\x2d555191a0\x2daa04\x2d49d4\x2daf46\x2d93b0ff584e2d.mount: Deactivated successfully.
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.964 2 DEBUG nova.virt.libvirt.vif [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-01T14:32:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-395021213',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-395021213',id=30,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:33:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='212993276c39412c938b179b82d692f2',ramdisk_id='',reservation_id='r-hr5wqkk0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1252378341',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1252378341-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:34:14Z,user_data=None,user_id='2c2679cca0d247d1828f85f7ce3bb197',uuid=9dcd8b44-4daa-408b-a130-bf9c003d0750,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a53c1a86-fb01-426e-9566-47e7ed07a37a", "address": "fa:16:3e:01:c0:62", "network": {"id": "555191a0-aa04-49d4-af46-93b0ff584e2d", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-742538066-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ce6811a65d40628bfc69d5eb9bcf01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa53c1a86-fb", "ovs_interfaceid": "a53c1a86-fb01-426e-9566-47e7ed07a37a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.964 2 DEBUG nova.network.os_vif_util [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Converting VIF {"id": "a53c1a86-fb01-426e-9566-47e7ed07a37a", "address": "fa:16:3e:01:c0:62", "network": {"id": "555191a0-aa04-49d4-af46-93b0ff584e2d", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-742538066-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ce6811a65d40628bfc69d5eb9bcf01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa53c1a86-fb", "ovs_interfaceid": "a53c1a86-fb01-426e-9566-47e7ed07a37a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.965 2 DEBUG nova.network.os_vif_util [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:c0:62,bridge_name='br-int',has_traffic_filtering=True,id=a53c1a86-fb01-426e-9566-47e7ed07a37a,network=Network(555191a0-aa04-49d4-af46-93b0ff584e2d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa53c1a86-fb') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.968 2 DEBUG os_vif [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:c0:62,bridge_name='br-int',has_traffic_filtering=True,id=a53c1a86-fb01-426e-9566-47e7ed07a37a,network=Network(555191a0-aa04-49d4-af46-93b0ff584e2d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa53c1a86-fb') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.972 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa53c1a86-fb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.977 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=7f76d115-ff6f-42d6-97a4-eda25b209b9d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.983 2 INFO os_vif [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:c0:62,bridge_name='br-int',has_traffic_filtering=True,id=a53c1a86-fb01-426e-9566-47e7ed07a37a,network=Network(555191a0-aa04-49d4-af46-93b0ff584e2d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa53c1a86-fb')
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.984 2 INFO nova.virt.libvirt.driver [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Deleting instance files /var/lib/nova/instances/9dcd8b44-4daa-408b-a130-bf9c003d0750_del
Oct 01 14:34:25 compute-0 nova_compute[192698]: 2025-10-01 14:34:25.985 2 INFO nova.virt.libvirt.driver [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Deletion of /var/lib/nova/instances/9dcd8b44-4daa-408b-a130-bf9c003d0750_del complete
Oct 01 14:34:26 compute-0 sshd-session[228570]: Failed password for root from 101.47.181.100 port 55612 ssh2
Oct 01 14:34:26 compute-0 nova_compute[192698]: 2025-10-01 14:34:26.499 2 INFO nova.compute.manager [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Took 1.32 seconds to destroy the instance on the hypervisor.
Oct 01 14:34:26 compute-0 nova_compute[192698]: 2025-10-01 14:34:26.500 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:34:26 compute-0 nova_compute[192698]: 2025-10-01 14:34:26.501 2 DEBUG nova.compute.manager [-] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:34:26 compute-0 nova_compute[192698]: 2025-10-01 14:34:26.501 2 DEBUG nova.network.neutron [-] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:34:26 compute-0 nova_compute[192698]: 2025-10-01 14:34:26.501 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:34:26 compute-0 nova_compute[192698]: 2025-10-01 14:34:26.771 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:34:27 compute-0 sshd-session[228570]: Connection closed by authenticating user root 101.47.181.100 port 55612 [preauth]
Oct 01 14:34:27 compute-0 nova_compute[192698]: 2025-10-01 14:34:27.524 2 DEBUG nova.compute.manager [req-a7c166f4-c871-4dde-a5e2-acde7889104a req-bf59ea9e-1b05-4dcc-808a-f910388b94e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Received event network-vif-unplugged-a53c1a86-fb01-426e-9566-47e7ed07a37a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:34:27 compute-0 nova_compute[192698]: 2025-10-01 14:34:27.524 2 DEBUG oslo_concurrency.lockutils [req-a7c166f4-c871-4dde-a5e2-acde7889104a req-bf59ea9e-1b05-4dcc-808a-f910388b94e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "9dcd8b44-4daa-408b-a130-bf9c003d0750-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:34:27 compute-0 nova_compute[192698]: 2025-10-01 14:34:27.525 2 DEBUG oslo_concurrency.lockutils [req-a7c166f4-c871-4dde-a5e2-acde7889104a req-bf59ea9e-1b05-4dcc-808a-f910388b94e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9dcd8b44-4daa-408b-a130-bf9c003d0750-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:34:27 compute-0 nova_compute[192698]: 2025-10-01 14:34:27.526 2 DEBUG oslo_concurrency.lockutils [req-a7c166f4-c871-4dde-a5e2-acde7889104a req-bf59ea9e-1b05-4dcc-808a-f910388b94e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9dcd8b44-4daa-408b-a130-bf9c003d0750-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:34:27 compute-0 nova_compute[192698]: 2025-10-01 14:34:27.526 2 DEBUG nova.compute.manager [req-a7c166f4-c871-4dde-a5e2-acde7889104a req-bf59ea9e-1b05-4dcc-808a-f910388b94e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] No waiting events found dispatching network-vif-unplugged-a53c1a86-fb01-426e-9566-47e7ed07a37a pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:34:27 compute-0 nova_compute[192698]: 2025-10-01 14:34:27.527 2 DEBUG nova.compute.manager [req-a7c166f4-c871-4dde-a5e2-acde7889104a req-bf59ea9e-1b05-4dcc-808a-f910388b94e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Received event network-vif-unplugged-a53c1a86-fb01-426e-9566-47e7ed07a37a for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:34:27 compute-0 nova_compute[192698]: 2025-10-01 14:34:27.527 2 DEBUG nova.compute.manager [req-a7c166f4-c871-4dde-a5e2-acde7889104a req-bf59ea9e-1b05-4dcc-808a-f910388b94e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Received event network-vif-deleted-a53c1a86-fb01-426e-9566-47e7ed07a37a external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:34:27 compute-0 nova_compute[192698]: 2025-10-01 14:34:27.527 2 INFO nova.compute.manager [req-a7c166f4-c871-4dde-a5e2-acde7889104a req-bf59ea9e-1b05-4dcc-808a-f910388b94e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Neutron deleted interface a53c1a86-fb01-426e-9566-47e7ed07a37a; detaching it from the instance and deleting it from the info cache
Oct 01 14:34:27 compute-0 nova_compute[192698]: 2025-10-01 14:34:27.528 2 DEBUG nova.network.neutron [req-a7c166f4-c871-4dde-a5e2-acde7889104a req-bf59ea9e-1b05-4dcc-808a-f910388b94e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:34:27 compute-0 nova_compute[192698]: 2025-10-01 14:34:27.535 2 DEBUG nova.network.neutron [-] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:34:28 compute-0 nova_compute[192698]: 2025-10-01 14:34:28.041 2 DEBUG nova.compute.manager [req-a7c166f4-c871-4dde-a5e2-acde7889104a req-bf59ea9e-1b05-4dcc-808a-f910388b94e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Detach interface failed, port_id=a53c1a86-fb01-426e-9566-47e7ed07a37a, reason: Instance 9dcd8b44-4daa-408b-a130-bf9c003d0750 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:34:28 compute-0 nova_compute[192698]: 2025-10-01 14:34:28.042 2 INFO nova.compute.manager [-] [instance: 9dcd8b44-4daa-408b-a130-bf9c003d0750] Took 1.54 seconds to deallocate network for instance.
Oct 01 14:34:28 compute-0 nova_compute[192698]: 2025-10-01 14:34:28.567 2 DEBUG oslo_concurrency.lockutils [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:34:28 compute-0 nova_compute[192698]: 2025-10-01 14:34:28.568 2 DEBUG oslo_concurrency.lockutils [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:34:28 compute-0 nova_compute[192698]: 2025-10-01 14:34:28.576 2 DEBUG oslo_concurrency.lockutils [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.008s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:34:28 compute-0 nova_compute[192698]: 2025-10-01 14:34:28.616 2 INFO nova.scheduler.client.report [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Deleted allocations for instance 9dcd8b44-4daa-408b-a130-bf9c003d0750
Oct 01 14:34:29 compute-0 nova_compute[192698]: 2025-10-01 14:34:29.655 2 DEBUG oslo_concurrency.lockutils [None req-4dfc4b76-d977-4a4e-a672-8b36546d460a 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lock "9dcd8b44-4daa-408b-a130-bf9c003d0750" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.013s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:34:29 compute-0 podman[203144]: time="2025-10-01T14:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:34:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:34:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3031 "" "Go-http-client/1.1"
Oct 01 14:34:30 compute-0 nova_compute[192698]: 2025-10-01 14:34:30.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:30 compute-0 nova_compute[192698]: 2025-10-01 14:34:30.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:31 compute-0 unix_chkpwd[228689]: password check failed for user (root)
Oct 01 14:34:31 compute-0 sshd-session[228687]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:34:31 compute-0 openstack_network_exporter[205307]: ERROR   14:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:34:31 compute-0 openstack_network_exporter[205307]: ERROR   14:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:34:31 compute-0 openstack_network_exporter[205307]: ERROR   14:34:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:34:31 compute-0 openstack_network_exporter[205307]: ERROR   14:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:34:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:34:31 compute-0 openstack_network_exporter[205307]: ERROR   14:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:34:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:34:33 compute-0 sshd-session[228687]: Failed password for root from 101.47.181.100 port 40828 ssh2
Oct 01 14:34:34 compute-0 sshd-session[228687]: Connection closed by authenticating user root 101.47.181.100 port 40828 [preauth]
Oct 01 14:34:34 compute-0 nova_compute[192698]: 2025-10-01 14:34:34.435 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:34:34 compute-0 nova_compute[192698]: 2025-10-01 14:34:34.436 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:34:34 compute-0 nova_compute[192698]: 2025-10-01 14:34:34.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:34:35 compute-0 nova_compute[192698]: 2025-10-01 14:34:35.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:35 compute-0 nova_compute[192698]: 2025-10-01 14:34:35.437 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:34:35 compute-0 nova_compute[192698]: 2025-10-01 14:34:35.437 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:34:35 compute-0 nova_compute[192698]: 2025-10-01 14:34:35.438 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:34:35 compute-0 nova_compute[192698]: 2025-10-01 14:34:35.438 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:34:35 compute-0 nova_compute[192698]: 2025-10-01 14:34:35.650 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:34:35 compute-0 nova_compute[192698]: 2025-10-01 14:34:35.651 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:34:35 compute-0 nova_compute[192698]: 2025-10-01 14:34:35.687 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:34:35 compute-0 nova_compute[192698]: 2025-10-01 14:34:35.688 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5788MB free_disk=73.302001953125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:34:35 compute-0 nova_compute[192698]: 2025-10-01 14:34:35.688 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:34:35 compute-0 nova_compute[192698]: 2025-10-01 14:34:35.688 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:34:35 compute-0 nova_compute[192698]: 2025-10-01 14:34:35.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:36 compute-0 nova_compute[192698]: 2025-10-01 14:34:36.767 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:34:36 compute-0 nova_compute[192698]: 2025-10-01 14:34:36.768 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:34:35 up  1:33,  0 user,  load average: 0.21, 0.16, 0.23\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:34:36 compute-0 nova_compute[192698]: 2025-10-01 14:34:36.806 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:34:37 compute-0 podman[228694]: 2025-10-01 14:34:37.171748272 +0000 UTC m=+0.077965909 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 01 14:34:37 compute-0 podman[228695]: 2025-10-01 14:34:37.202691665 +0000 UTC m=+0.113721042 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 01 14:34:37 compute-0 nova_compute[192698]: 2025-10-01 14:34:37.316 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:34:37 compute-0 nova_compute[192698]: 2025-10-01 14:34:37.831 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:34:37 compute-0 nova_compute[192698]: 2025-10-01 14:34:37.831 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.143s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:34:38 compute-0 nova_compute[192698]: 2025-10-01 14:34:38.832 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:34:38 compute-0 nova_compute[192698]: 2025-10-01 14:34:38.833 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:34:38 compute-0 unix_chkpwd[228737]: password check failed for user (root)
Oct 01 14:34:38 compute-0 sshd-session[228690]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:34:40 compute-0 nova_compute[192698]: 2025-10-01 14:34:40.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:40 compute-0 sshd-session[228690]: Failed password for root from 101.47.181.100 port 40842 ssh2
Oct 01 14:34:40 compute-0 nova_compute[192698]: 2025-10-01 14:34:40.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:34:40 compute-0 nova_compute[192698]: 2025-10-01 14:34:40.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:41 compute-0 sshd-session[228690]: Connection closed by authenticating user root 101.47.181.100 port 40842 [preauth]
Oct 01 14:34:41 compute-0 nova_compute[192698]: 2025-10-01 14:34:41.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:34:42 compute-0 nova_compute[192698]: 2025-10-01 14:34:42.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:34:42 compute-0 nova_compute[192698]: 2025-10-01 14:34:42.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:34:43 compute-0 unix_chkpwd[228740]: password check failed for user (root)
Oct 01 14:34:43 compute-0 sshd-session[228738]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:34:45 compute-0 podman[228741]: 2025-10-01 14:34:45.144708016 +0000 UTC m=+0.056377569 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, name=ubi9-minimal, container_name=openstack_network_exporter)
Oct 01 14:34:45 compute-0 nova_compute[192698]: 2025-10-01 14:34:45.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:45 compute-0 sshd-session[228738]: Failed password for root from 101.47.181.100 port 58632 ssh2
Oct 01 14:34:45 compute-0 nova_compute[192698]: 2025-10-01 14:34:45.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:46 compute-0 sshd-session[228738]: Connection closed by authenticating user root 101.47.181.100 port 58632 [preauth]
Oct 01 14:34:48 compute-0 unix_chkpwd[228765]: password check failed for user (root)
Oct 01 14:34:48 compute-0 sshd-session[228763]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:34:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:49.472 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:34:49 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:49.472 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:34:49 compute-0 nova_compute[192698]: 2025-10-01 14:34:49.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:50 compute-0 nova_compute[192698]: 2025-10-01 14:34:50.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:50 compute-0 podman[228768]: 2025-10-01 14:34:50.16969233 +0000 UTC m=+0.075583606 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 01 14:34:50 compute-0 podman[228767]: 2025-10-01 14:34:50.170206063 +0000 UTC m=+0.076346966 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct 01 14:34:50 compute-0 sshd-session[228763]: Failed password for root from 101.47.181.100 port 50754 ssh2
Oct 01 14:34:50 compute-0 nova_compute[192698]: 2025-10-01 14:34:50.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:51 compute-0 sshd-session[228763]: Connection closed by authenticating user root 101.47.181.100 port 50754 [preauth]
Oct 01 14:34:52 compute-0 nova_compute[192698]: 2025-10-01 14:34:52.100 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:34:52 compute-0 unix_chkpwd[228813]: password check failed for user (root)
Oct 01 14:34:52 compute-0 sshd-session[228811]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:34:55 compute-0 sshd-session[228811]: Failed password for root from 101.47.181.100 port 50766 ssh2
Oct 01 14:34:55 compute-0 nova_compute[192698]: 2025-10-01 14:34:55.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:55 compute-0 nova_compute[192698]: 2025-10-01 14:34:55.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:34:56 compute-0 podman[228816]: 2025-10-01 14:34:56.169699527 +0000 UTC m=+0.081902715 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 14:34:56 compute-0 sshd-session[228811]: Connection closed by authenticating user root 101.47.181.100 port 50766 [preauth]
Oct 01 14:34:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:34:56.473 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:34:58 compute-0 unix_chkpwd[228841]: password check failed for user (root)
Oct 01 14:34:58 compute-0 sshd-session[228815]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:34:59 compute-0 sshd-session[228815]: Failed password for root from 101.47.181.100 port 50774 ssh2
Oct 01 14:34:59 compute-0 podman[203144]: time="2025-10-01T14:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:34:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:34:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3031 "" "Go-http-client/1.1"
Oct 01 14:35:00 compute-0 nova_compute[192698]: 2025-10-01 14:35:00.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:00 compute-0 nova_compute[192698]: 2025-10-01 14:35:00.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:01 compute-0 sshd-session[228815]: Connection closed by authenticating user root 101.47.181.100 port 50774 [preauth]
Oct 01 14:35:01 compute-0 openstack_network_exporter[205307]: ERROR   14:35:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:35:01 compute-0 openstack_network_exporter[205307]: ERROR   14:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:35:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:35:01 compute-0 openstack_network_exporter[205307]: ERROR   14:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:35:01 compute-0 openstack_network_exporter[205307]: ERROR   14:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:35:01 compute-0 openstack_network_exporter[205307]: ERROR   14:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:35:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:35:02 compute-0 sshd[128084]: Timeout before authentication for connection from 101.47.181.100 to 38.102.83.163, pid = 228001
Oct 01 14:35:04 compute-0 unix_chkpwd[228844]: password check failed for user (root)
Oct 01 14:35:04 compute-0 sshd-session[228842]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=101.47.181.100  user=root
Oct 01 14:35:05 compute-0 nova_compute[192698]: 2025-10-01 14:35:05.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:05 compute-0 sshd-session[228842]: Failed password for root from 101.47.181.100 port 45250 ssh2
Oct 01 14:35:05 compute-0 nova_compute[192698]: 2025-10-01 14:35:05.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:07 compute-0 sshd[128084]: drop connection #1 from [101.47.181.100]:51634 on [38.102.83.163]:22 penalty: exceeded LoginGraceTime
Oct 01 14:35:08 compute-0 podman[228845]: 2025-10-01 14:35:08.192765734 +0000 UTC m=+0.093349723 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 01 14:35:08 compute-0 podman[228846]: 2025-10-01 14:35:08.23869159 +0000 UTC m=+0.135588320 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 01 14:35:08 compute-0 sshd-session[228842]: Connection closed by authenticating user root 101.47.181.100 port 45250 [preauth]
Oct 01 14:35:10 compute-0 nova_compute[192698]: 2025-10-01 14:35:10.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:10 compute-0 nova_compute[192698]: 2025-10-01 14:35:10.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:14.304 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:35:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:14.304 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:35:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:14.305 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:35:15 compute-0 nova_compute[192698]: 2025-10-01 14:35:15.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:15 compute-0 podman[228893]: 2025-10-01 14:35:15.426045838 +0000 UTC m=+0.087541358 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public)
Oct 01 14:35:16 compute-0 nova_compute[192698]: 2025-10-01 14:35:15.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:20 compute-0 nova_compute[192698]: 2025-10-01 14:35:20.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:21 compute-0 nova_compute[192698]: 2025-10-01 14:35:21.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:21 compute-0 podman[228914]: 2025-10-01 14:35:21.161451823 +0000 UTC m=+0.080093637 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 01 14:35:21 compute-0 podman[228915]: 2025-10-01 14:35:21.170746453 +0000 UTC m=+0.086036807 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Oct 01 14:35:25 compute-0 nova_compute[192698]: 2025-10-01 14:35:25.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:26 compute-0 nova_compute[192698]: 2025-10-01 14:35:26.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:26 compute-0 nova_compute[192698]: 2025-10-01 14:35:26.360 2 DEBUG nova.virt.libvirt.driver [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Creating tmpfile /var/lib/nova/instances/tmp75f_371v to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 01 14:35:26 compute-0 nova_compute[192698]: 2025-10-01 14:35:26.361 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:26 compute-0 nova_compute[192698]: 2025-10-01 14:35:26.366 2 DEBUG nova.compute.manager [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp75f_371v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 01 14:35:27 compute-0 podman[228954]: 2025-10-01 14:35:27.158413508 +0000 UTC m=+0.075558075 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:35:28 compute-0 nova_compute[192698]: 2025-10-01 14:35:28.412 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:29 compute-0 podman[203144]: time="2025-10-01T14:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:35:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:35:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3023 "" "Go-http-client/1.1"
Oct 01 14:35:30 compute-0 nova_compute[192698]: 2025-10-01 14:35:30.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:31 compute-0 nova_compute[192698]: 2025-10-01 14:35:31.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:31 compute-0 openstack_network_exporter[205307]: ERROR   14:35:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:35:31 compute-0 openstack_network_exporter[205307]: ERROR   14:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:35:31 compute-0 openstack_network_exporter[205307]: ERROR   14:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:35:31 compute-0 openstack_network_exporter[205307]: ERROR   14:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:35:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:35:31 compute-0 openstack_network_exporter[205307]: ERROR   14:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:35:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:35:32 compute-0 nova_compute[192698]: 2025-10-01 14:35:32.768 2 DEBUG nova.compute.manager [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp75f_371v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d829b039-9b2c-46dc-9950-4e3c4fc9d308',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 01 14:35:33 compute-0 nova_compute[192698]: 2025-10-01 14:35:33.784 2 DEBUG oslo_concurrency.lockutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-d829b039-9b2c-46dc-9950-4e3c4fc9d308" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:35:33 compute-0 nova_compute[192698]: 2025-10-01 14:35:33.785 2 DEBUG oslo_concurrency.lockutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-d829b039-9b2c-46dc-9950-4e3c4fc9d308" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:35:33 compute-0 nova_compute[192698]: 2025-10-01 14:35:33.785 2 DEBUG nova.network.neutron [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:35:34 compute-0 nova_compute[192698]: 2025-10-01 14:35:34.291 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:34 compute-0 nova_compute[192698]: 2025-10-01 14:35:34.434 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:35:34 compute-0 nova_compute[192698]: 2025-10-01 14:35:34.974 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:35 compute-0 nova_compute[192698]: 2025-10-01 14:35:35.102 2 DEBUG nova.network.neutron [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Updating instance_info_cache with network_info: [{"id": "4a475952-24c7-4c86-b7af-a18aef022152", "address": "fa:16:3e:8b:ea:d1", "network": {"id": "555191a0-aa04-49d4-af46-93b0ff584e2d", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-742538066-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ce6811a65d40628bfc69d5eb9bcf01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a475952-24", "ovs_interfaceid": "4a475952-24c7-4c86-b7af-a18aef022152", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:35:35 compute-0 nova_compute[192698]: 2025-10-01 14:35:35.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:35 compute-0 nova_compute[192698]: 2025-10-01 14:35:35.609 2 DEBUG oslo_concurrency.lockutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-d829b039-9b2c-46dc-9950-4e3c4fc9d308" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:35:35 compute-0 nova_compute[192698]: 2025-10-01 14:35:35.623 2 DEBUG nova.virt.libvirt.driver [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp75f_371v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d829b039-9b2c-46dc-9950-4e3c4fc9d308',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 01 14:35:35 compute-0 nova_compute[192698]: 2025-10-01 14:35:35.623 2 DEBUG nova.virt.libvirt.driver [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Creating instance directory: /var/lib/nova/instances/d829b039-9b2c-46dc-9950-4e3c4fc9d308 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 01 14:35:35 compute-0 nova_compute[192698]: 2025-10-01 14:35:35.624 2 DEBUG nova.virt.libvirt.driver [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Creating disk.info with the contents: {'/var/lib/nova/instances/d829b039-9b2c-46dc-9950-4e3c4fc9d308/disk': 'qcow2', '/var/lib/nova/instances/d829b039-9b2c-46dc-9950-4e3c4fc9d308/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 01 14:35:35 compute-0 nova_compute[192698]: 2025-10-01 14:35:35.624 2 DEBUG nova.virt.libvirt.driver [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 01 14:35:35 compute-0 nova_compute[192698]: 2025-10-01 14:35:35.625 2 DEBUG nova.objects.instance [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid d829b039-9b2c-46dc-9950-4e3c4fc9d308 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:35:35 compute-0 nova_compute[192698]: 2025-10-01 14:35:35.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:35:35 compute-0 nova_compute[192698]: 2025-10-01 14:35:35.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.131 2 DEBUG oslo_utils.imageutils.format_inspector [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.138 2 DEBUG oslo_utils.imageutils.format_inspector [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.140 2 DEBUG oslo_concurrency.processutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.230 2 DEBUG oslo_concurrency.processutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.231 2 DEBUG oslo_concurrency.lockutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.232 2 DEBUG oslo_concurrency.lockutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.232 2 DEBUG oslo_utils.imageutils.format_inspector [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.236 2 DEBUG oslo_utils.imageutils.format_inspector [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.237 2 DEBUG oslo_concurrency.processutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.303 2 DEBUG oslo_concurrency.processutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.307 2 DEBUG oslo_concurrency.processutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/d829b039-9b2c-46dc-9950-4e3c4fc9d308/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.349 2 DEBUG oslo_concurrency.processutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/d829b039-9b2c-46dc-9950-4e3c4fc9d308/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.351 2 DEBUG oslo_concurrency.lockutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.351 2 DEBUG oslo_concurrency.processutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.428 2 DEBUG oslo_concurrency.processutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.430 2 DEBUG nova.virt.disk.api [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/d829b039-9b2c-46dc-9950-4e3c4fc9d308/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.431 2 DEBUG oslo_concurrency.processutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d829b039-9b2c-46dc-9950-4e3c4fc9d308/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.448 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.450 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.451 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.451 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.494 2 DEBUG oslo_concurrency.processutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d829b039-9b2c-46dc-9950-4e3c4fc9d308/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.496 2 DEBUG nova.virt.disk.api [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/d829b039-9b2c-46dc-9950-4e3c4fc9d308/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.497 2 DEBUG nova.objects.instance [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid d829b039-9b2c-46dc-9950-4e3c4fc9d308 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.688 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.690 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.735 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.736 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5834MB free_disk=73.30176162719727GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.736 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:35:36 compute-0 nova_compute[192698]: 2025-10-01 14:35:36.737 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.007 2 DEBUG nova.objects.base [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<d829b039-9b2c-46dc-9950-4e3c4fc9d308> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.008 2 DEBUG oslo_concurrency.processutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/d829b039-9b2c-46dc-9950-4e3c4fc9d308/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.059 2 DEBUG oslo_concurrency.processutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/d829b039-9b2c-46dc-9950-4e3c4fc9d308/disk.config 497664" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.060 2 DEBUG nova.virt.libvirt.driver [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.061 2 DEBUG nova.virt.libvirt.vif [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-01T14:34:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1186792317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1186792317',id=32,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:34:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='212993276c39412c938b179b82d692f2',ramdisk_id='',reservation_id='r-4mr0e38m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1252378341',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1252378341-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:34:50Z,user_data=None,user_id='2c2679cca0d247d1828f85f7ce3bb197',uuid=d829b039-9b2c-46dc-9950-4e3c4fc9d308,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4a475952-24c7-4c86-b7af-a18aef022152", "address": "fa:16:3e:8b:ea:d1", "network": {"id": "555191a0-aa04-49d4-af46-93b0ff584e2d", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-742538066-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ce6811a65d40628bfc69d5eb9bcf01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4a475952-24", "ovs_interfaceid": "4a475952-24c7-4c86-b7af-a18aef022152", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.062 2 DEBUG nova.network.os_vif_util [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "4a475952-24c7-4c86-b7af-a18aef022152", "address": "fa:16:3e:8b:ea:d1", "network": {"id": "555191a0-aa04-49d4-af46-93b0ff584e2d", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-742538066-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ce6811a65d40628bfc69d5eb9bcf01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4a475952-24", "ovs_interfaceid": "4a475952-24c7-4c86-b7af-a18aef022152", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.063 2 DEBUG nova.network.os_vif_util [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:ea:d1,bridge_name='br-int',has_traffic_filtering=True,id=4a475952-24c7-4c86-b7af-a18aef022152,network=Network(555191a0-aa04-49d4-af46-93b0ff584e2d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a475952-24') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.063 2 DEBUG os_vif [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:ea:d1,bridge_name='br-int',has_traffic_filtering=True,id=4a475952-24c7-4c86-b7af-a18aef022152,network=Network(555191a0-aa04-49d4-af46-93b0ff584e2d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a475952-24') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.065 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.065 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.066 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '298e95da-00c6-5e4e-8f83-942db39e681e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.076 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a475952-24, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.077 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap4a475952-24, col_values=(('qos', UUID('66db463c-6e91-47bc-9e16-02d02f575b3a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.078 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap4a475952-24, col_values=(('external_ids', {'iface-id': '4a475952-24c7-4c86-b7af-a18aef022152', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:ea:d1', 'vm-uuid': 'd829b039-9b2c-46dc-9950-4e3c4fc9d308'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:37 compute-0 NetworkManager[51741]: <info>  [1759329337.0822] manager: (tap4a475952-24): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.090 2 INFO os_vif [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:ea:d1,bridge_name='br-int',has_traffic_filtering=True,id=4a475952-24c7-4c86-b7af-a18aef022152,network=Network(555191a0-aa04-49d4-af46-93b0ff584e2d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a475952-24')
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.091 2 DEBUG nova.virt.libvirt.driver [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.091 2 DEBUG nova.compute.manager [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp75f_371v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d829b039-9b2c-46dc-9950-4e3c4fc9d308',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.093 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.214 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:37 compute-0 nova_compute[192698]: 2025-10-01 14:35:37.758 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Migration for instance d829b039-9b2c-46dc-9950-4e3c4fc9d308 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Oct 01 14:35:38 compute-0 nova_compute[192698]: 2025-10-01 14:35:38.003 2 DEBUG nova.network.neutron [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Port 4a475952-24c7-4c86-b7af-a18aef022152 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 01 14:35:38 compute-0 nova_compute[192698]: 2025-10-01 14:35:38.017 2 DEBUG nova.compute.manager [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp75f_371v',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d829b039-9b2c-46dc-9950-4e3c4fc9d308',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 01 14:35:38 compute-0 nova_compute[192698]: 2025-10-01 14:35:38.265 2 INFO nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Updating resource usage from migration dade0015-be7c-4203-965c-4d9180c7ff51
Oct 01 14:35:38 compute-0 nova_compute[192698]: 2025-10-01 14:35:38.266 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Starting to track incoming migration dade0015-be7c-4203-965c-4d9180c7ff51 with flavor b86137c0-c1b7-45a5-9778-6d64c7367f5a _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Oct 01 14:35:39 compute-0 ovn_controller[94909]: 2025-10-01T14:35:39Z|00259|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct 01 14:35:39 compute-0 podman[229000]: 2025-10-01 14:35:39.197676322 +0000 UTC m=+0.094614128 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, io.buildah.version=1.41.4)
Oct 01 14:35:39 compute-0 podman[229001]: 2025-10-01 14:35:39.214448723 +0000 UTC m=+0.112479838 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 01 14:35:39 compute-0 nova_compute[192698]: 2025-10-01 14:35:39.309 2 WARNING nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance d829b039-9b2c-46dc-9950-4e3c4fc9d308 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 1151, 'VCPU': 1}}.
Oct 01 14:35:39 compute-0 nova_compute[192698]: 2025-10-01 14:35:39.310 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:35:39 compute-0 nova_compute[192698]: 2025-10-01 14:35:39.310 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1663MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:35:36 up  1:34,  0 user,  load average: 0.08, 0.13, 0.21\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:35:39 compute-0 nova_compute[192698]: 2025-10-01 14:35:39.364 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:35:39 compute-0 nova_compute[192698]: 2025-10-01 14:35:39.873 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:35:40 compute-0 nova_compute[192698]: 2025-10-01 14:35:40.386 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:35:40 compute-0 nova_compute[192698]: 2025-10-01 14:35:40.387 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.650s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:35:40 compute-0 nova_compute[192698]: 2025-10-01 14:35:40.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:41 compute-0 nova_compute[192698]: 2025-10-01 14:35:41.387 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:35:41 compute-0 nova_compute[192698]: 2025-10-01 14:35:41.389 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:35:41 compute-0 kernel: tap4a475952-24: entered promiscuous mode
Oct 01 14:35:41 compute-0 NetworkManager[51741]: <info>  [1759329341.6788] manager: (tap4a475952-24): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Oct 01 14:35:41 compute-0 ovn_controller[94909]: 2025-10-01T14:35:41Z|00260|binding|INFO|Claiming lport 4a475952-24c7-4c86-b7af-a18aef022152 for this additional chassis.
Oct 01 14:35:41 compute-0 ovn_controller[94909]: 2025-10-01T14:35:41Z|00261|binding|INFO|4a475952-24c7-4c86-b7af-a18aef022152: Claiming fa:16:3e:8b:ea:d1 10.100.0.11
Oct 01 14:35:41 compute-0 nova_compute[192698]: 2025-10-01 14:35:41.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.689 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:ea:d1 10.100.0.11'], port_security=['fa:16:3e:8b:ea:d1 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd829b039-9b2c-46dc-9950-4e3c4fc9d308', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-555191a0-aa04-49d4-af46-93b0ff584e2d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '212993276c39412c938b179b82d692f2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '134d03e5-3921-4f89-b7a4-e938969031c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3857fb1d-1e54-4a99-a828-506b6bdd5885, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=4a475952-24c7-4c86-b7af-a18aef022152) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.691 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 4a475952-24c7-4c86-b7af-a18aef022152 in datapath 555191a0-aa04-49d4-af46-93b0ff584e2d unbound from our chassis
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.692 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 555191a0-aa04-49d4-af46-93b0ff584e2d
Oct 01 14:35:41 compute-0 ovn_controller[94909]: 2025-10-01T14:35:41Z|00262|binding|INFO|Setting lport 4a475952-24c7-4c86-b7af-a18aef022152 ovn-installed in OVS
Oct 01 14:35:41 compute-0 nova_compute[192698]: 2025-10-01 14:35:41.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:41 compute-0 nova_compute[192698]: 2025-10-01 14:35:41.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.713 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ad56faef-c337-4d1d-abd8-b9a438c5f1fe]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.714 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap555191a0-a1 in ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.717 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap555191a0-a0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.717 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[25327dc9-d461-438b-9109-8426d5f4a2bc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.718 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[28edce59-aeac-4a38-8b63-33386ac6df0a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:41 compute-0 systemd-udevd[229060]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.737 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[6c4d8cea-bb4e-4249-8ce0-7df2fa045332]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:41 compute-0 NetworkManager[51741]: <info>  [1759329341.7447] device (tap4a475952-24): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.747 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ac58c4d8-7831-4930-a775-920326d492bd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:41 compute-0 NetworkManager[51741]: <info>  [1759329341.7484] device (tap4a475952-24): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:35:41 compute-0 systemd-machined[152704]: New machine qemu-25-instance-00000020.
Oct 01 14:35:41 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-00000020.
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.789 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[19ab07ba-7ae7-4952-a12d-eccf912edfdf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:41 compute-0 systemd-udevd[229067]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.795 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a1ac4386-12b8-4b7f-816a-e97e5cb387d0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:41 compute-0 NetworkManager[51741]: <info>  [1759329341.7973] manager: (tap555191a0-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/98)
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.843 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[39beb6c6-379c-426c-b3bc-82e7c9eebf0a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.846 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[a1570d39-6ff6-4316-bfaf-2aeeff614f29]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:41 compute-0 NetworkManager[51741]: <info>  [1759329341.8748] device (tap555191a0-a0): carrier: link connected
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.886 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[0812ac8c-d96f-4910-9b83-5e4cfbdefb39]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.907 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c656909e-404d-4db8-8732-df7d7e4f3bc2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap555191a0-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:d1:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570240, 'reachable_time': 25357, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229095, 'error': None, 'target': 'ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:41 compute-0 nova_compute[192698]: 2025-10-01 14:35:41.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.924 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[dac0e511-4572-4114-af76-b9b0991dcd5a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef5:d1eb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 570240, 'tstamp': 570240}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229096, 'error': None, 'target': 'ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.944 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[8e327258-4dee-4545-8ce3-2b8ee7d1259d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap555191a0-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:d1:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570240, 'reachable_time': 25357, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229097, 'error': None, 'target': 'ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:41 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:41.992 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[5f95d9d1-1ed3-4197-9377-915a52bf6833]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.069 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[40d5642d-9a83-4147-9f63-0439895524ec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.071 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap555191a0-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.071 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.072 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap555191a0-a0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:35:42 compute-0 nova_compute[192698]: 2025-10-01 14:35:42.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:42 compute-0 NetworkManager[51741]: <info>  [1759329342.0761] manager: (tap555191a0-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Oct 01 14:35:42 compute-0 kernel: tap555191a0-a0: entered promiscuous mode
Oct 01 14:35:42 compute-0 nova_compute[192698]: 2025-10-01 14:35:42.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.091 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap555191a0-a0, col_values=(('external_ids', {'iface-id': '0a30025f-995f-4cd9-a2bb-85c8c480de92'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:35:42 compute-0 nova_compute[192698]: 2025-10-01 14:35:42.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:42 compute-0 ovn_controller[94909]: 2025-10-01T14:35:42Z|00263|binding|INFO|Releasing lport 0a30025f-995f-4cd9-a2bb-85c8c480de92 from this chassis (sb_readonly=0)
Oct 01 14:35:42 compute-0 nova_compute[192698]: 2025-10-01 14:35:42.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.108 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[9bef83eb-fabc-430d-9e07-e4744312e8ec]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:42 compute-0 nova_compute[192698]: 2025-10-01 14:35:42.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.111 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.111 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.111 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 555191a0-aa04-49d4-af46-93b0ff584e2d disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.111 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.112 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[2cd5bdf2-5774-4da2-9a97-5119456d10f5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.112 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.113 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c53fe837-83d8-4f1f-934f-6510ad3b4f0d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.113 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-555191a0-aa04-49d4-af46-93b0ff584e2d
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID 555191a0-aa04-49d4-af46-93b0ff584e2d
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:35:42 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:42.114 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d', 'env', 'PROCESS_TAG=haproxy-555191a0-aa04-49d4-af46-93b0ff584e2d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/555191a0-aa04-49d4-af46-93b0ff584e2d.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:35:42 compute-0 podman[229136]: 2025-10-01 14:35:42.553125598 +0000 UTC m=+0.084582668 container create c11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 01 14:35:42 compute-0 podman[229136]: 2025-10-01 14:35:42.510834659 +0000 UTC m=+0.042291830 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:35:42 compute-0 systemd[1]: Started libpod-conmon-c11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa.scope.
Oct 01 14:35:42 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:35:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca92e2dc1f5bcdd23d6c5c5395cdab9447dc235081d0383da8b9b7aec0a8512a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:35:42 compute-0 podman[229136]: 2025-10-01 14:35:42.66541847 +0000 UTC m=+0.196875570 container init c11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Oct 01 14:35:42 compute-0 podman[229136]: 2025-10-01 14:35:42.678963595 +0000 UTC m=+0.210420665 container start c11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Oct 01 14:35:42 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[229151]: [NOTICE]   (229155) : New worker (229157) forked
Oct 01 14:35:42 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[229151]: [NOTICE]   (229155) : Loading success.
Oct 01 14:35:42 compute-0 nova_compute[192698]: 2025-10-01 14:35:42.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:35:43 compute-0 nova_compute[192698]: 2025-10-01 14:35:43.425 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:35:43 compute-0 nova_compute[192698]: 2025-10-01 14:35:43.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:35:43 compute-0 nova_compute[192698]: 2025-10-01 14:35:43.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:35:45 compute-0 ovn_controller[94909]: 2025-10-01T14:35:45Z|00264|binding|INFO|Claiming lport 4a475952-24c7-4c86-b7af-a18aef022152 for this chassis.
Oct 01 14:35:45 compute-0 ovn_controller[94909]: 2025-10-01T14:35:45Z|00265|binding|INFO|4a475952-24c7-4c86-b7af-a18aef022152: Claiming fa:16:3e:8b:ea:d1 10.100.0.11
Oct 01 14:35:45 compute-0 ovn_controller[94909]: 2025-10-01T14:35:45Z|00266|binding|INFO|Setting lport 4a475952-24c7-4c86-b7af-a18aef022152 up in Southbound
Oct 01 14:35:45 compute-0 nova_compute[192698]: 2025-10-01 14:35:45.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:46 compute-0 podman[229181]: 2025-10-01 14:35:46.157758672 +0000 UTC m=+0.071811424 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=edpm, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct 01 14:35:47 compute-0 nova_compute[192698]: 2025-10-01 14:35:47.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:47 compute-0 nova_compute[192698]: 2025-10-01 14:35:47.975 2 INFO nova.compute.manager [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Post operation of migration started
Oct 01 14:35:47 compute-0 nova_compute[192698]: 2025-10-01 14:35:47.976 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:48 compute-0 nova_compute[192698]: 2025-10-01 14:35:48.644 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:48 compute-0 nova_compute[192698]: 2025-10-01 14:35:48.645 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:48 compute-0 nova_compute[192698]: 2025-10-01 14:35:48.767 2 DEBUG oslo_concurrency.lockutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-d829b039-9b2c-46dc-9950-4e3c4fc9d308" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:35:48 compute-0 nova_compute[192698]: 2025-10-01 14:35:48.768 2 DEBUG oslo_concurrency.lockutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-d829b039-9b2c-46dc-9950-4e3c4fc9d308" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:35:48 compute-0 nova_compute[192698]: 2025-10-01 14:35:48.768 2 DEBUG nova.network.neutron [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:35:49 compute-0 nova_compute[192698]: 2025-10-01 14:35:49.275 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:49 compute-0 nova_compute[192698]: 2025-10-01 14:35:49.725 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:49 compute-0 nova_compute[192698]: 2025-10-01 14:35:49.866 2 DEBUG nova.network.neutron [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Updating instance_info_cache with network_info: [{"id": "4a475952-24c7-4c86-b7af-a18aef022152", "address": "fa:16:3e:8b:ea:d1", "network": {"id": "555191a0-aa04-49d4-af46-93b0ff584e2d", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-742538066-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ce6811a65d40628bfc69d5eb9bcf01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a475952-24", "ovs_interfaceid": "4a475952-24c7-4c86-b7af-a18aef022152", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:35:50 compute-0 nova_compute[192698]: 2025-10-01 14:35:50.372 2 DEBUG oslo_concurrency.lockutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-d829b039-9b2c-46dc-9950-4e3c4fc9d308" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:35:50 compute-0 nova_compute[192698]: 2025-10-01 14:35:50.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:50 compute-0 nova_compute[192698]: 2025-10-01 14:35:50.891 2 DEBUG oslo_concurrency.lockutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:35:50 compute-0 nova_compute[192698]: 2025-10-01 14:35:50.892 2 DEBUG oslo_concurrency.lockutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:35:50 compute-0 nova_compute[192698]: 2025-10-01 14:35:50.892 2 DEBUG oslo_concurrency.lockutils [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:35:50 compute-0 nova_compute[192698]: 2025-10-01 14:35:50.897 2 INFO nova.virt.libvirt.driver [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 01 14:35:50 compute-0 virtqemud[192597]: Domain id=25 name='instance-00000020' uuid=d829b039-9b2c-46dc-9950-4e3c4fc9d308 is tainted: custom-monitor
Oct 01 14:35:51 compute-0 nova_compute[192698]: 2025-10-01 14:35:51.906 2 INFO nova.virt.libvirt.driver [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 01 14:35:52 compute-0 nova_compute[192698]: 2025-10-01 14:35:52.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:52 compute-0 podman[229203]: 2025-10-01 14:35:52.14592528 +0000 UTC m=+0.057282173 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:35:52 compute-0 podman[229204]: 2025-10-01 14:35:52.154673916 +0000 UTC m=+0.064940269 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4)
Oct 01 14:35:52 compute-0 nova_compute[192698]: 2025-10-01 14:35:52.913 2 INFO nova.virt.libvirt.driver [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 01 14:35:52 compute-0 nova_compute[192698]: 2025-10-01 14:35:52.918 2 DEBUG nova.compute.manager [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:35:53 compute-0 nova_compute[192698]: 2025-10-01 14:35:53.431 2 DEBUG nova.objects.instance [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:35:54 compute-0 nova_compute[192698]: 2025-10-01 14:35:54.451 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:54 compute-0 nova_compute[192698]: 2025-10-01 14:35:54.639 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:54 compute-0 nova_compute[192698]: 2025-10-01 14:35:54.640 2 WARNING neutronclient.v2_0.client [None req-cc7df69a-3eba-42cf-96b7-0e4c9ac7d24a a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:35:55 compute-0 nova_compute[192698]: 2025-10-01 14:35:55.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:57 compute-0 nova_compute[192698]: 2025-10-01 14:35:57.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:58 compute-0 podman[229243]: 2025-10-01 14:35:58.181804984 +0000 UTC m=+0.087566998 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:35:59 compute-0 nova_compute[192698]: 2025-10-01 14:35:59.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:35:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:59.529 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:35:59 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:35:59.530 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:35:59 compute-0 podman[203144]: time="2025-10-01T14:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:35:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:35:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3494 "" "Go-http-client/1.1"
Oct 01 14:36:00 compute-0 nova_compute[192698]: 2025-10-01 14:36:00.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:01 compute-0 openstack_network_exporter[205307]: ERROR   14:36:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:36:01 compute-0 openstack_network_exporter[205307]: ERROR   14:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:36:01 compute-0 openstack_network_exporter[205307]: ERROR   14:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:36:01 compute-0 openstack_network_exporter[205307]: ERROR   14:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:36:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:36:01 compute-0 openstack_network_exporter[205307]: ERROR   14:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:36:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:36:02 compute-0 nova_compute[192698]: 2025-10-01 14:36:02.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.139 2 DEBUG oslo_concurrency.lockutils [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Acquiring lock "d829b039-9b2c-46dc-9950-4e3c4fc9d308" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.140 2 DEBUG oslo_concurrency.lockutils [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lock "d829b039-9b2c-46dc-9950-4e3c4fc9d308" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.141 2 DEBUG oslo_concurrency.lockutils [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Acquiring lock "d829b039-9b2c-46dc-9950-4e3c4fc9d308-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.141 2 DEBUG oslo_concurrency.lockutils [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lock "d829b039-9b2c-46dc-9950-4e3c4fc9d308-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.142 2 DEBUG oslo_concurrency.lockutils [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lock "d829b039-9b2c-46dc-9950-4e3c4fc9d308-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.157 2 INFO nova.compute.manager [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Terminating instance
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.677 2 DEBUG nova.compute.manager [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:36:04 compute-0 kernel: tap4a475952-24 (unregistering): left promiscuous mode
Oct 01 14:36:04 compute-0 NetworkManager[51741]: <info>  [1759329364.7041] device (tap4a475952-24): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:36:04 compute-0 ovn_controller[94909]: 2025-10-01T14:36:04Z|00267|binding|INFO|Releasing lport 4a475952-24c7-4c86-b7af-a18aef022152 from this chassis (sb_readonly=0)
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:04 compute-0 ovn_controller[94909]: 2025-10-01T14:36:04Z|00268|binding|INFO|Setting lport 4a475952-24c7-4c86-b7af-a18aef022152 down in Southbound
Oct 01 14:36:04 compute-0 ovn_controller[94909]: 2025-10-01T14:36:04Z|00269|binding|INFO|Removing iface tap4a475952-24 ovn-installed in OVS
Oct 01 14:36:04 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:04.729 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:ea:d1 10.100.0.11'], port_security=['fa:16:3e:8b:ea:d1 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd829b039-9b2c-46dc-9950-4e3c4fc9d308', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-555191a0-aa04-49d4-af46-93b0ff584e2d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '212993276c39412c938b179b82d692f2', 'neutron:revision_number': '15', 'neutron:security_group_ids': '134d03e5-3921-4f89-b7a4-e938969031c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3857fb1d-1e54-4a99-a828-506b6bdd5885, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=4a475952-24c7-4c86-b7af-a18aef022152) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:36:04 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:04.730 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 4a475952-24c7-4c86-b7af-a18aef022152 in datapath 555191a0-aa04-49d4-af46-93b0ff584e2d unbound from our chassis
Oct 01 14:36:04 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:04.732 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 555191a0-aa04-49d4-af46-93b0ff584e2d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:36:04 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:04.733 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[40b052c3-bc66-41fe-b411-37818f694280]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:36:04 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:04.733 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d namespace which is not needed anymore
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:04 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000020.scope: Deactivated successfully.
Oct 01 14:36:04 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000020.scope: Consumed 3.061s CPU time.
Oct 01 14:36:04 compute-0 systemd-machined[152704]: Machine qemu-25-instance-00000020 terminated.
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.884 2 DEBUG nova.compute.manager [req-efdfe819-4a01-46d2-80b7-43587dbc4384 req-ab8a7c99-1f4e-4d18-9ff0-393fd3dbe905 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Received event network-vif-unplugged-4a475952-24c7-4c86-b7af-a18aef022152 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.885 2 DEBUG oslo_concurrency.lockutils [req-efdfe819-4a01-46d2-80b7-43587dbc4384 req-ab8a7c99-1f4e-4d18-9ff0-393fd3dbe905 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "d829b039-9b2c-46dc-9950-4e3c4fc9d308-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.885 2 DEBUG oslo_concurrency.lockutils [req-efdfe819-4a01-46d2-80b7-43587dbc4384 req-ab8a7c99-1f4e-4d18-9ff0-393fd3dbe905 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "d829b039-9b2c-46dc-9950-4e3c4fc9d308-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.885 2 DEBUG oslo_concurrency.lockutils [req-efdfe819-4a01-46d2-80b7-43587dbc4384 req-ab8a7c99-1f4e-4d18-9ff0-393fd3dbe905 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "d829b039-9b2c-46dc-9950-4e3c4fc9d308-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.886 2 DEBUG nova.compute.manager [req-efdfe819-4a01-46d2-80b7-43587dbc4384 req-ab8a7c99-1f4e-4d18-9ff0-393fd3dbe905 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] No waiting events found dispatching network-vif-unplugged-4a475952-24c7-4c86-b7af-a18aef022152 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.886 2 DEBUG nova.compute.manager [req-efdfe819-4a01-46d2-80b7-43587dbc4384 req-ab8a7c99-1f4e-4d18-9ff0-393fd3dbe905 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Received event network-vif-unplugged-4a475952-24c7-4c86-b7af-a18aef022152 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:36:04 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[229151]: [NOTICE]   (229155) : haproxy version is 3.0.5-8e879a5
Oct 01 14:36:04 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[229151]: [NOTICE]   (229155) : path to executable is /usr/sbin/haproxy
Oct 01 14:36:04 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[229151]: [WARNING]  (229155) : Exiting Master process...
Oct 01 14:36:04 compute-0 podman[229292]: 2025-10-01 14:36:04.893337374 +0000 UTC m=+0.049363900 container kill c11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:36:04 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[229151]: [ALERT]    (229155) : Current worker (229157) exited with code 143 (Terminated)
Oct 01 14:36:04 compute-0 neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d[229151]: [WARNING]  (229155) : All workers exited. Exiting... (0)
Oct 01 14:36:04 compute-0 systemd[1]: libpod-c11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa.scope: Deactivated successfully.
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.910 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:04 compute-0 podman[229313]: 2025-10-01 14:36:04.962371892 +0000 UTC m=+0.031540930 container died c11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.974 2 INFO nova.virt.libvirt.driver [-] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Instance destroyed successfully.
Oct 01 14:36:04 compute-0 nova_compute[192698]: 2025-10-01 14:36:04.976 2 DEBUG nova.objects.instance [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lazy-loading 'resources' on Instance uuid d829b039-9b2c-46dc-9950-4e3c4fc9d308 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:36:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa-userdata-shm.mount: Deactivated successfully.
Oct 01 14:36:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-ca92e2dc1f5bcdd23d6c5c5395cdab9447dc235081d0383da8b9b7aec0a8512a-merged.mount: Deactivated successfully.
Oct 01 14:36:05 compute-0 podman[229313]: 2025-10-01 14:36:05.024003221 +0000 UTC m=+0.093172249 container remove c11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 01 14:36:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:05.033 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[03541a9c-e5a4-43e4-911c-ab5232a08d1e]: (4, ("Wed Oct  1 02:36:04 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d (c11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa)\nc11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa\nWed Oct  1 02:36:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d (c11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa)\nc11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:36:05 compute-0 systemd[1]: libpod-conmon-c11575c41db10beee7e530981a62348a4915f059de6d8f4c712af9bb8f405bfa.scope: Deactivated successfully.
Oct 01 14:36:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:05.035 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c60a749f-5020-47b7-af53-c6b30c46589d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:36:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:05.036 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/555191a0-aa04-49d4-af46-93b0ff584e2d.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:36:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:05.037 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[70e2092c-a199-4b0b-8620-52a2df95ee58]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:36:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:05.038 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap555191a0-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:05 compute-0 kernel: tap555191a0-a0: left promiscuous mode
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:05.073 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7dbfb18e-9698-4d77-8d17-e8f00db45871]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:36:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:05.114 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2560b4-53b2-4700-8862-8cb3f4d1b947]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:36:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:05.116 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[613d490f-df23-4732-85fa-2a315c99c2b5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:36:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:05.140 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c65a6246-3dfe-40b2-8a0a-4a78ff99f365]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 570231, 'reachable_time': 21056, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229352, 'error': None, 'target': 'ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:36:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:05.144 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-555191a0-aa04-49d4-af46-93b0ff584e2d deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:36:05 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:05.145 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[260a358c-c9e3-4aba-9267-3e8d4bc3046c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:36:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d555191a0\x2daa04\x2d49d4\x2daf46\x2d93b0ff584e2d.mount: Deactivated successfully.
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.483 2 DEBUG nova.virt.libvirt.vif [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-01T14:34:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1186792317',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1186792317',id=32,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:34:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='212993276c39412c938b179b82d692f2',ramdisk_id='',reservation_id='r-4mr0e38m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader,manager',clean_attempts='1',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-1252378341',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-1252378341-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:35:53Z,user_data=None,user_id='2c2679cca0d247d1828f85f7ce3bb197',uuid=d829b039-9b2c-46dc-9950-4e3c4fc9d308,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4a475952-24c7-4c86-b7af-a18aef022152", "address": "fa:16:3e:8b:ea:d1", "network": {"id": "555191a0-aa04-49d4-af46-93b0ff584e2d", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-742538066-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ce6811a65d40628bfc69d5eb9bcf01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a475952-24", "ovs_interfaceid": "4a475952-24c7-4c86-b7af-a18aef022152", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.484 2 DEBUG nova.network.os_vif_util [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Converting VIF {"id": "4a475952-24c7-4c86-b7af-a18aef022152", "address": "fa:16:3e:8b:ea:d1", "network": {"id": "555191a0-aa04-49d4-af46-93b0ff584e2d", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-742538066-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c0ce6811a65d40628bfc69d5eb9bcf01", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a475952-24", "ovs_interfaceid": "4a475952-24c7-4c86-b7af-a18aef022152", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.485 2 DEBUG nova.network.os_vif_util [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8b:ea:d1,bridge_name='br-int',has_traffic_filtering=True,id=4a475952-24c7-4c86-b7af-a18aef022152,network=Network(555191a0-aa04-49d4-af46-93b0ff584e2d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a475952-24') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.485 2 DEBUG os_vif [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:ea:d1,bridge_name='br-int',has_traffic_filtering=True,id=4a475952-24c7-4c86-b7af-a18aef022152,network=Network(555191a0-aa04-49d4-af46-93b0ff584e2d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a475952-24') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.487 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a475952-24, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.492 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=66db463c-6e91-47bc-9e16-02d02f575b3a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.496 2 INFO os_vif [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8b:ea:d1,bridge_name='br-int',has_traffic_filtering=True,id=4a475952-24c7-4c86-b7af-a18aef022152,network=Network(555191a0-aa04-49d4-af46-93b0ff584e2d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a475952-24')
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.496 2 INFO nova.virt.libvirt.driver [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Deleting instance files /var/lib/nova/instances/d829b039-9b2c-46dc-9950-4e3c4fc9d308_del
Oct 01 14:36:05 compute-0 nova_compute[192698]: 2025-10-01 14:36:05.497 2 INFO nova.virt.libvirt.driver [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Deletion of /var/lib/nova/instances/d829b039-9b2c-46dc-9950-4e3c4fc9d308_del complete
Oct 01 14:36:06 compute-0 nova_compute[192698]: 2025-10-01 14:36:06.008 2 INFO nova.compute.manager [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Took 1.33 seconds to destroy the instance on the hypervisor.
Oct 01 14:36:06 compute-0 nova_compute[192698]: 2025-10-01 14:36:06.009 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:36:06 compute-0 nova_compute[192698]: 2025-10-01 14:36:06.009 2 DEBUG nova.compute.manager [-] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:36:06 compute-0 nova_compute[192698]: 2025-10-01 14:36:06.009 2 DEBUG nova.network.neutron [-] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:36:06 compute-0 nova_compute[192698]: 2025-10-01 14:36:06.010 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:36:06 compute-0 nova_compute[192698]: 2025-10-01 14:36:06.812 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:36:06 compute-0 nova_compute[192698]: 2025-10-01 14:36:06.939 2 DEBUG nova.compute.manager [req-4e93878b-d0dd-46ad-a27d-3f31eef0f3a3 req-0779c0cc-716a-465e-a322-410f3204b8b7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Received event network-vif-unplugged-4a475952-24c7-4c86-b7af-a18aef022152 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:36:06 compute-0 nova_compute[192698]: 2025-10-01 14:36:06.940 2 DEBUG oslo_concurrency.lockutils [req-4e93878b-d0dd-46ad-a27d-3f31eef0f3a3 req-0779c0cc-716a-465e-a322-410f3204b8b7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "d829b039-9b2c-46dc-9950-4e3c4fc9d308-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:36:06 compute-0 nova_compute[192698]: 2025-10-01 14:36:06.941 2 DEBUG oslo_concurrency.lockutils [req-4e93878b-d0dd-46ad-a27d-3f31eef0f3a3 req-0779c0cc-716a-465e-a322-410f3204b8b7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "d829b039-9b2c-46dc-9950-4e3c4fc9d308-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:36:06 compute-0 nova_compute[192698]: 2025-10-01 14:36:06.941 2 DEBUG oslo_concurrency.lockutils [req-4e93878b-d0dd-46ad-a27d-3f31eef0f3a3 req-0779c0cc-716a-465e-a322-410f3204b8b7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "d829b039-9b2c-46dc-9950-4e3c4fc9d308-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:36:06 compute-0 nova_compute[192698]: 2025-10-01 14:36:06.941 2 DEBUG nova.compute.manager [req-4e93878b-d0dd-46ad-a27d-3f31eef0f3a3 req-0779c0cc-716a-465e-a322-410f3204b8b7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] No waiting events found dispatching network-vif-unplugged-4a475952-24c7-4c86-b7af-a18aef022152 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:36:06 compute-0 nova_compute[192698]: 2025-10-01 14:36:06.942 2 DEBUG nova.compute.manager [req-4e93878b-d0dd-46ad-a27d-3f31eef0f3a3 req-0779c0cc-716a-465e-a322-410f3204b8b7 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Received event network-vif-unplugged-4a475952-24c7-4c86-b7af-a18aef022152 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:36:07 compute-0 nova_compute[192698]: 2025-10-01 14:36:07.106 2 DEBUG nova.compute.manager [req-3147b22e-6cfb-412e-b206-8862889fd3a8 req-89185541-2d03-4736-b826-8787559e0bc9 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Received event network-vif-deleted-4a475952-24c7-4c86-b7af-a18aef022152 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:36:07 compute-0 nova_compute[192698]: 2025-10-01 14:36:07.107 2 INFO nova.compute.manager [req-3147b22e-6cfb-412e-b206-8862889fd3a8 req-89185541-2d03-4736-b826-8787559e0bc9 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Neutron deleted interface 4a475952-24c7-4c86-b7af-a18aef022152; detaching it from the instance and deleting it from the info cache
Oct 01 14:36:07 compute-0 nova_compute[192698]: 2025-10-01 14:36:07.108 2 DEBUG nova.network.neutron [req-3147b22e-6cfb-412e-b206-8862889fd3a8 req-89185541-2d03-4736-b826-8787559e0bc9 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:36:07 compute-0 nova_compute[192698]: 2025-10-01 14:36:07.530 2 DEBUG nova.network.neutron [-] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:36:07 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:07.532 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:36:07 compute-0 nova_compute[192698]: 2025-10-01 14:36:07.620 2 DEBUG nova.compute.manager [req-3147b22e-6cfb-412e-b206-8862889fd3a8 req-89185541-2d03-4736-b826-8787559e0bc9 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Detach interface failed, port_id=4a475952-24c7-4c86-b7af-a18aef022152, reason: Instance d829b039-9b2c-46dc-9950-4e3c4fc9d308 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:36:08 compute-0 nova_compute[192698]: 2025-10-01 14:36:08.039 2 INFO nova.compute.manager [-] [instance: d829b039-9b2c-46dc-9950-4e3c4fc9d308] Took 2.03 seconds to deallocate network for instance.
Oct 01 14:36:08 compute-0 nova_compute[192698]: 2025-10-01 14:36:08.560 2 DEBUG oslo_concurrency.lockutils [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:36:08 compute-0 nova_compute[192698]: 2025-10-01 14:36:08.561 2 DEBUG oslo_concurrency.lockutils [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:36:08 compute-0 nova_compute[192698]: 2025-10-01 14:36:08.568 2 DEBUG oslo_concurrency.lockutils [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:36:08 compute-0 nova_compute[192698]: 2025-10-01 14:36:08.614 2 INFO nova.scheduler.client.report [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Deleted allocations for instance d829b039-9b2c-46dc-9950-4e3c4fc9d308
Oct 01 14:36:09 compute-0 nova_compute[192698]: 2025-10-01 14:36:09.653 2 DEBUG oslo_concurrency.lockutils [None req-37ff54c6-b650-4dfe-9154-4babd5e3fad0 2c2679cca0d247d1828f85f7ce3bb197 212993276c39412c938b179b82d692f2 - - default default] Lock "d829b039-9b2c-46dc-9950-4e3c4fc9d308" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.513s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:36:10 compute-0 podman[229353]: 2025-10-01 14:36:10.160185939 +0000 UTC m=+0.073163831 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 01 14:36:10 compute-0 podman[229354]: 2025-10-01 14:36:10.213150514 +0000 UTC m=+0.121995484 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 01 14:36:10 compute-0 nova_compute[192698]: 2025-10-01 14:36:10.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:10 compute-0 nova_compute[192698]: 2025-10-01 14:36:10.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:14.307 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:36:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:14.307 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:36:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:14.307 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:36:15 compute-0 nova_compute[192698]: 2025-10-01 14:36:15.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:36:15 compute-0 nova_compute[192698]: 2025-10-01 14:36:15.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:15 compute-0 nova_compute[192698]: 2025-10-01 14:36:15.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:17 compute-0 podman[229401]: 2025-10-01 14:36:17.165122596 +0000 UTC m=+0.071894716 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, container_name=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal)
Oct 01 14:36:20 compute-0 nova_compute[192698]: 2025-10-01 14:36:20.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:20 compute-0 nova_compute[192698]: 2025-10-01 14:36:20.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:23 compute-0 podman[229423]: 2025-10-01 14:36:23.144629563 +0000 UTC m=+0.062621687 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:36:23 compute-0 podman[229424]: 2025-10-01 14:36:23.160172151 +0000 UTC m=+0.073520390 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:36:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:25.186 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:25:47 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b2141291-fcc5-46d8-add3-8cd850ddcd19', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2141291-fcc5-46d8-add3-8cd850ddcd19', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cf1172c50b76471d8a1bc0e716e11d6a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=352c0b1e-6c3a-4c51-8519-b648a3ff7e09, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=897047c2-fb20-40fa-b1ad-e3997cba428d) old=Port_Binding(mac=['fa:16:3e:f5:25:47'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b2141291-fcc5-46d8-add3-8cd850ddcd19', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2141291-fcc5-46d8-add3-8cd850ddcd19', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cf1172c50b76471d8a1bc0e716e11d6a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:36:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:25.187 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 897047c2-fb20-40fa-b1ad-e3997cba428d in datapath b2141291-fcc5-46d8-add3-8cd850ddcd19 updated
Oct 01 14:36:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:25.188 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2141291-fcc5-46d8-add3-8cd850ddcd19, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:36:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:25.189 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[62d74481-34ea-4eb1-8af8-7a3bd1877db1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:36:25 compute-0 nova_compute[192698]: 2025-10-01 14:36:25.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:25 compute-0 nova_compute[192698]: 2025-10-01 14:36:25.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:29 compute-0 podman[229462]: 2025-10-01 14:36:29.183799525 +0000 UTC m=+0.086075798 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:36:29 compute-0 podman[203144]: time="2025-10-01T14:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:36:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:36:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3029 "" "Go-http-client/1.1"
Oct 01 14:36:30 compute-0 nova_compute[192698]: 2025-10-01 14:36:30.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:30 compute-0 nova_compute[192698]: 2025-10-01 14:36:30.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:31 compute-0 openstack_network_exporter[205307]: ERROR   14:36:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:36:31 compute-0 openstack_network_exporter[205307]: ERROR   14:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:36:31 compute-0 openstack_network_exporter[205307]: ERROR   14:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:36:31 compute-0 openstack_network_exporter[205307]: ERROR   14:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:36:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:36:31 compute-0 openstack_network_exporter[205307]: ERROR   14:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:36:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:36:33 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:33.101 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:9b:d8 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c8f3c31a-ae68-43bf-85dc-67da058a6914', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8f3c31a-ae68-43bf-85dc-67da058a6914', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dae8700d24b4283beb3d402c248bc67', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=94ec44d9-5e1d-4b26-b93e-d8165233592e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=29add452-a2f5-4949-bb0a-ce938e83ce0a) old=Port_Binding(mac=['fa:16:3e:df:9b:d8'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c8f3c31a-ae68-43bf-85dc-67da058a6914', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c8f3c31a-ae68-43bf-85dc-67da058a6914', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5dae8700d24b4283beb3d402c248bc67', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:36:33 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:33.103 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 29add452-a2f5-4949-bb0a-ce938e83ce0a in datapath c8f3c31a-ae68-43bf-85dc-67da058a6914 updated
Oct 01 14:36:33 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:33.104 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c8f3c31a-ae68-43bf-85dc-67da058a6914, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:36:33 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:36:33.106 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[bbeeaef2-1a05-44f8-8359-00ad237a7e18]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:36:34 compute-0 nova_compute[192698]: 2025-10-01 14:36:34.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:36:35 compute-0 nova_compute[192698]: 2025-10-01 14:36:35.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:35 compute-0 nova_compute[192698]: 2025-10-01 14:36:35.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:35 compute-0 nova_compute[192698]: 2025-10-01 14:36:35.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:36:36 compute-0 nova_compute[192698]: 2025-10-01 14:36:36.445 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:36:36 compute-0 nova_compute[192698]: 2025-10-01 14:36:36.445 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:36:36 compute-0 nova_compute[192698]: 2025-10-01 14:36:36.445 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:36:36 compute-0 nova_compute[192698]: 2025-10-01 14:36:36.446 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:36:36 compute-0 nova_compute[192698]: 2025-10-01 14:36:36.680 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:36:36 compute-0 nova_compute[192698]: 2025-10-01 14:36:36.681 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:36:36 compute-0 nova_compute[192698]: 2025-10-01 14:36:36.705 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:36:36 compute-0 nova_compute[192698]: 2025-10-01 14:36:36.706 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5848MB free_disk=73.30203628540039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:36:36 compute-0 nova_compute[192698]: 2025-10-01 14:36:36.707 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:36:36 compute-0 nova_compute[192698]: 2025-10-01 14:36:36.707 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:36:37 compute-0 nova_compute[192698]: 2025-10-01 14:36:37.765 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:36:37 compute-0 nova_compute[192698]: 2025-10-01 14:36:37.766 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:36:36 up  1:35,  0 user,  load average: 0.10, 0.12, 0.20\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:36:37 compute-0 nova_compute[192698]: 2025-10-01 14:36:37.797 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:36:38 compute-0 nova_compute[192698]: 2025-10-01 14:36:38.313 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:36:38 compute-0 nova_compute[192698]: 2025-10-01 14:36:38.822 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:36:38 compute-0 nova_compute[192698]: 2025-10-01 14:36:38.823 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:36:39 compute-0 nova_compute[192698]: 2025-10-01 14:36:39.823 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:36:39 compute-0 nova_compute[192698]: 2025-10-01 14:36:39.824 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:36:39 compute-0 nova_compute[192698]: 2025-10-01 14:36:39.825 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:36:40 compute-0 nova_compute[192698]: 2025-10-01 14:36:40.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:40 compute-0 nova_compute[192698]: 2025-10-01 14:36:40.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:41 compute-0 unix_chkpwd[229510]: password check failed for user (root)
Oct 01 14:36:41 compute-0 sshd-session[229488]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Oct 01 14:36:41 compute-0 podman[229490]: 2025-10-01 14:36:41.136897098 +0000 UTC m=+0.058068104 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 01 14:36:41 compute-0 podman[229491]: 2025-10-01 14:36:41.18450038 +0000 UTC m=+0.100084205 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 01 14:36:42 compute-0 nova_compute[192698]: 2025-10-01 14:36:42.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:36:43 compute-0 sshd-session[229488]: Failed password for root from 80.94.93.233 port 28588 ssh2
Oct 01 14:36:43 compute-0 unix_chkpwd[229534]: password check failed for user (root)
Oct 01 14:36:43 compute-0 nova_compute[192698]: 2025-10-01 14:36:43.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:36:43 compute-0 nova_compute[192698]: 2025-10-01 14:36:43.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:36:44 compute-0 nova_compute[192698]: 2025-10-01 14:36:44.927 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:36:45 compute-0 nova_compute[192698]: 2025-10-01 14:36:45.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:45 compute-0 nova_compute[192698]: 2025-10-01 14:36:45.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:45 compute-0 sshd-session[229488]: Failed password for root from 80.94.93.233 port 28588 ssh2
Oct 01 14:36:46 compute-0 unix_chkpwd[229535]: password check failed for user (root)
Oct 01 14:36:48 compute-0 podman[229536]: 2025-10-01 14:36:48.188515211 +0000 UTC m=+0.100905477 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Oct 01 14:36:48 compute-0 sshd-session[229488]: Failed password for root from 80.94.93.233 port 28588 ssh2
Oct 01 14:36:48 compute-0 ovn_controller[94909]: 2025-10-01T14:36:48Z|00270|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct 01 14:36:49 compute-0 sshd-session[229488]: Received disconnect from 80.94.93.233 port 28588:11:  [preauth]
Oct 01 14:36:49 compute-0 sshd-session[229488]: Disconnected from authenticating user root 80.94.93.233 port 28588 [preauth]
Oct 01 14:36:49 compute-0 sshd-session[229488]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Oct 01 14:36:50 compute-0 unix_chkpwd[229559]: password check failed for user (root)
Oct 01 14:36:50 compute-0 sshd-session[229557]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Oct 01 14:36:50 compute-0 nova_compute[192698]: 2025-10-01 14:36:50.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:50 compute-0 nova_compute[192698]: 2025-10-01 14:36:50.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:52 compute-0 sshd-session[229557]: Failed password for root from 80.94.93.233 port 41342 ssh2
Oct 01 14:36:53 compute-0 unix_chkpwd[229560]: password check failed for user (root)
Oct 01 14:36:54 compute-0 podman[229562]: 2025-10-01 14:36:54.17144325 +0000 UTC m=+0.088497964 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd)
Oct 01 14:36:54 compute-0 podman[229561]: 2025-10-01 14:36:54.178690485 +0000 UTC m=+0.099410878 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Oct 01 14:36:55 compute-0 sshd-session[229557]: Failed password for root from 80.94.93.233 port 41342 ssh2
Oct 01 14:36:55 compute-0 nova_compute[192698]: 2025-10-01 14:36:55.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:55 compute-0 nova_compute[192698]: 2025-10-01 14:36:55.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:36:55 compute-0 unix_chkpwd[229601]: password check failed for user (root)
Oct 01 14:36:58 compute-0 sshd-session[229557]: Failed password for root from 80.94.93.233 port 41342 ssh2
Oct 01 14:36:58 compute-0 sshd-session[229557]: Received disconnect from 80.94.93.233 port 41342:11:  [preauth]
Oct 01 14:36:58 compute-0 sshd-session[229557]: Disconnected from authenticating user root 80.94.93.233 port 41342 [preauth]
Oct 01 14:36:58 compute-0 sshd-session[229557]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Oct 01 14:36:59 compute-0 unix_chkpwd[229606]: password check failed for user (root)
Oct 01 14:36:59 compute-0 sshd-session[229603]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Oct 01 14:36:59 compute-0 sshd-session[229602]: Invalid user admin from 185.156.73.233 port 40108
Oct 01 14:36:59 compute-0 podman[203144]: time="2025-10-01T14:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:36:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:36:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3029 "" "Go-http-client/1.1"
Oct 01 14:36:59 compute-0 podman[229607]: 2025-10-01 14:36:59.82004319 +0000 UTC m=+0.083306963 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:36:59 compute-0 sshd-session[229602]: pam_unix(sshd:auth): check pass; user unknown
Oct 01 14:36:59 compute-0 sshd-session[229602]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233
Oct 01 14:37:00 compute-0 nova_compute[192698]: 2025-10-01 14:37:00.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:00 compute-0 nova_compute[192698]: 2025-10-01 14:37:00.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:01 compute-0 openstack_network_exporter[205307]: ERROR   14:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:37:01 compute-0 openstack_network_exporter[205307]: ERROR   14:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:37:01 compute-0 openstack_network_exporter[205307]: ERROR   14:37:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:37:01 compute-0 openstack_network_exporter[205307]: ERROR   14:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:37:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:37:01 compute-0 openstack_network_exporter[205307]: ERROR   14:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:37:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:37:01 compute-0 sshd-session[229603]: Failed password for root from 80.94.93.233 port 22322 ssh2
Oct 01 14:37:02 compute-0 sshd-session[229602]: Failed password for invalid user admin from 185.156.73.233 port 40108 ssh2
Oct 01 14:37:02 compute-0 sshd-session[229602]: Connection closed by invalid user admin 185.156.73.233 port 40108 [preauth]
Oct 01 14:37:02 compute-0 unix_chkpwd[229629]: password check failed for user (root)
Oct 01 14:37:04 compute-0 sshd-session[229603]: Failed password for root from 80.94.93.233 port 22322 ssh2
Oct 01 14:37:05 compute-0 unix_chkpwd[229630]: password check failed for user (root)
Oct 01 14:37:05 compute-0 nova_compute[192698]: 2025-10-01 14:37:05.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:05 compute-0 nova_compute[192698]: 2025-10-01 14:37:05.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:07 compute-0 sshd-session[229603]: Failed password for root from 80.94.93.233 port 22322 ssh2
Oct 01 14:37:08 compute-0 sshd-session[229603]: Received disconnect from 80.94.93.233 port 22322:11:  [preauth]
Oct 01 14:37:08 compute-0 sshd-session[229603]: Disconnected from authenticating user root 80.94.93.233 port 22322 [preauth]
Oct 01 14:37:08 compute-0 sshd-session[229603]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.233  user=root
Oct 01 14:37:10 compute-0 nova_compute[192698]: 2025-10-01 14:37:10.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:10 compute-0 nova_compute[192698]: 2025-10-01 14:37:10.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:37:12.055 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:37:12 compute-0 nova_compute[192698]: 2025-10-01 14:37:12.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:12 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:37:12.057 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:37:12 compute-0 podman[229632]: 2025-10-01 14:37:12.156862962 +0000 UTC m=+0.060054888 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 01 14:37:12 compute-0 podman[229633]: 2025-10-01 14:37:12.172355989 +0000 UTC m=+0.080260442 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Oct 01 14:37:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:37:14.308 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:37:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:37:14.309 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:37:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:37:14.310 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:37:15 compute-0 nova_compute[192698]: 2025-10-01 14:37:15.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:15 compute-0 nova_compute[192698]: 2025-10-01 14:37:15.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:19 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:37:19.059 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:37:19 compute-0 podman[229679]: 2025-10-01 14:37:19.179078413 +0000 UTC m=+0.086162170 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9)
Oct 01 14:37:20 compute-0 nova_compute[192698]: 2025-10-01 14:37:20.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:20 compute-0 nova_compute[192698]: 2025-10-01 14:37:20.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:25 compute-0 podman[229703]: 2025-10-01 14:37:25.138974351 +0000 UTC m=+0.058786683 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct 01 14:37:25 compute-0 podman[229702]: 2025-10-01 14:37:25.149783342 +0000 UTC m=+0.071256909 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 01 14:37:25 compute-0 nova_compute[192698]: 2025-10-01 14:37:25.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:25 compute-0 nova_compute[192698]: 2025-10-01 14:37:25.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:29 compute-0 podman[203144]: time="2025-10-01T14:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:37:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:37:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3031 "" "Go-http-client/1.1"
Oct 01 14:37:30 compute-0 podman[229741]: 2025-10-01 14:37:30.146496176 +0000 UTC m=+0.064260952 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 14:37:30 compute-0 nova_compute[192698]: 2025-10-01 14:37:30.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:30 compute-0 nova_compute[192698]: 2025-10-01 14:37:30.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:31 compute-0 openstack_network_exporter[205307]: ERROR   14:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:37:31 compute-0 openstack_network_exporter[205307]: ERROR   14:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:37:31 compute-0 openstack_network_exporter[205307]: ERROR   14:37:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:37:31 compute-0 openstack_network_exporter[205307]: ERROR   14:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:37:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:37:31 compute-0 openstack_network_exporter[205307]: ERROR   14:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:37:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:37:35 compute-0 nova_compute[192698]: 2025-10-01 14:37:35.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:37:35 compute-0 nova_compute[192698]: 2025-10-01 14:37:35.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:35 compute-0 nova_compute[192698]: 2025-10-01 14:37:35.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Oct 01 14:37:35 compute-0 nova_compute[192698]: 2025-10-01 14:37:35.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:37:35 compute-0 nova_compute[192698]: 2025-10-01 14:37:35.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:37:35 compute-0 nova_compute[192698]: 2025-10-01 14:37:35.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:35 compute-0 nova_compute[192698]: 2025-10-01 14:37:35.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:37:36 compute-0 nova_compute[192698]: 2025-10-01 14:37:36.443 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:37:36 compute-0 nova_compute[192698]: 2025-10-01 14:37:36.444 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:37:36 compute-0 nova_compute[192698]: 2025-10-01 14:37:36.445 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:37:36 compute-0 nova_compute[192698]: 2025-10-01 14:37:36.445 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:37:36 compute-0 nova_compute[192698]: 2025-10-01 14:37:36.592 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:37:36 compute-0 nova_compute[192698]: 2025-10-01 14:37:36.594 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:37:36 compute-0 nova_compute[192698]: 2025-10-01 14:37:36.617 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:37:36 compute-0 nova_compute[192698]: 2025-10-01 14:37:36.618 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5848MB free_disk=73.30204010009766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:37:36 compute-0 nova_compute[192698]: 2025-10-01 14:37:36.618 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:37:36 compute-0 nova_compute[192698]: 2025-10-01 14:37:36.618 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:37:37 compute-0 nova_compute[192698]: 2025-10-01 14:37:37.680 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:37:37 compute-0 nova_compute[192698]: 2025-10-01 14:37:37.680 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:37:36 up  1:36,  0 user,  load average: 0.38, 0.19, 0.22\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:37:37 compute-0 nova_compute[192698]: 2025-10-01 14:37:37.711 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:37:38 compute-0 nova_compute[192698]: 2025-10-01 14:37:38.219 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:37:38 compute-0 nova_compute[192698]: 2025-10-01 14:37:38.731 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:37:38 compute-0 nova_compute[192698]: 2025-10-01 14:37:38.732 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:37:39 compute-0 nova_compute[192698]: 2025-10-01 14:37:39.733 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:37:39 compute-0 nova_compute[192698]: 2025-10-01 14:37:39.734 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:37:39 compute-0 nova_compute[192698]: 2025-10-01 14:37:39.734 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:37:39 compute-0 nova_compute[192698]: 2025-10-01 14:37:39.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:37:40 compute-0 nova_compute[192698]: 2025-10-01 14:37:40.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:42 compute-0 nova_compute[192698]: 2025-10-01 14:37:42.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:37:42 compute-0 nova_compute[192698]: 2025-10-01 14:37:42.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:37:43 compute-0 podman[229766]: 2025-10-01 14:37:43.149512999 +0000 UTC m=+0.063703386 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 01 14:37:43 compute-0 podman[229767]: 2025-10-01 14:37:43.210056388 +0000 UTC m=+0.113409673 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Oct 01 14:37:44 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 01 14:37:45 compute-0 nova_compute[192698]: 2025-10-01 14:37:45.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:37:45 compute-0 nova_compute[192698]: 2025-10-01 14:37:45.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:45 compute-0 nova_compute[192698]: 2025-10-01 14:37:45.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Oct 01 14:37:45 compute-0 nova_compute[192698]: 2025-10-01 14:37:45.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:37:45 compute-0 nova_compute[192698]: 2025-10-01 14:37:45.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:37:45 compute-0 nova_compute[192698]: 2025-10-01 14:37:45.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:45 compute-0 nova_compute[192698]: 2025-10-01 14:37:45.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:37:45 compute-0 nova_compute[192698]: 2025-10-01 14:37:45.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:37:45 compute-0 nova_compute[192698]: 2025-10-01 14:37:45.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:37:50 compute-0 podman[229814]: 2025-10-01 14:37:50.150369366 +0000 UTC m=+0.070297163 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, version=9.6)
Oct 01 14:37:50 compute-0 nova_compute[192698]: 2025-10-01 14:37:50.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:37:50 compute-0 nova_compute[192698]: 2025-10-01 14:37:50.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:37:50 compute-0 nova_compute[192698]: 2025-10-01 14:37:50.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Oct 01 14:37:50 compute-0 nova_compute[192698]: 2025-10-01 14:37:50.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:37:50 compute-0 nova_compute[192698]: 2025-10-01 14:37:50.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:50 compute-0 nova_compute[192698]: 2025-10-01 14:37:50.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:37:55 compute-0 nova_compute[192698]: 2025-10-01 14:37:55.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:37:55 compute-0 nova_compute[192698]: 2025-10-01 14:37:55.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:37:55 compute-0 nova_compute[192698]: 2025-10-01 14:37:55.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5055 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Oct 01 14:37:55 compute-0 nova_compute[192698]: 2025-10-01 14:37:55.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:37:55 compute-0 nova_compute[192698]: 2025-10-01 14:37:55.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:37:55 compute-0 nova_compute[192698]: 2025-10-01 14:37:55.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:37:56 compute-0 podman[229835]: 2025-10-01 14:37:56.164801444 +0000 UTC m=+0.076123790 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=iscsid)
Oct 01 14:37:56 compute-0 podman[229836]: 2025-10-01 14:37:56.174090454 +0000 UTC m=+0.078249747 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Oct 01 14:37:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:37:56.968 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:8a:f6 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9271916c-5214-4c09-935e-13b34b50b900', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9271916c-5214-4c09-935e-13b34b50b900', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bae9f8c3123c4b158b8c2b37547b3432', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a4f5a34-42c0-4655-a64c-41091a291e78, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=28b1485d-6c1b-4553-a241-33ad08214b7a) old=Port_Binding(mac=['fa:16:3e:05:8a:f6'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-9271916c-5214-4c09-935e-13b34b50b900', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9271916c-5214-4c09-935e-13b34b50b900', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bae9f8c3123c4b158b8c2b37547b3432', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:37:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:37:56.969 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 28b1485d-6c1b-4553-a241-33ad08214b7a in datapath 9271916c-5214-4c09-935e-13b34b50b900 updated
Oct 01 14:37:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:37:56.971 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9271916c-5214-4c09-935e-13b34b50b900, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:37:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:37:56.972 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[213be304-246b-487c-a16a-81241a6d0f5f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:37:59 compute-0 podman[203144]: time="2025-10-01T14:37:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:37:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:37:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:37:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:37:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3027 "" "Go-http-client/1.1"
Oct 01 14:38:00 compute-0 nova_compute[192698]: 2025-10-01 14:38:00.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:38:00 compute-0 nova_compute[192698]: 2025-10-01 14:38:00.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:38:00 compute-0 nova_compute[192698]: 2025-10-01 14:38:00.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Oct 01 14:38:00 compute-0 nova_compute[192698]: 2025-10-01 14:38:00.755 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:38:00 compute-0 nova_compute[192698]: 2025-10-01 14:38:00.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:00 compute-0 nova_compute[192698]: 2025-10-01 14:38:00.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:38:01 compute-0 podman[229875]: 2025-10-01 14:38:01.153996815 +0000 UTC m=+0.072062620 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:38:01 compute-0 openstack_network_exporter[205307]: ERROR   14:38:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:38:01 compute-0 openstack_network_exporter[205307]: ERROR   14:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:38:01 compute-0 openstack_network_exporter[205307]: ERROR   14:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:38:01 compute-0 openstack_network_exporter[205307]: ERROR   14:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:38:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:38:01 compute-0 openstack_network_exporter[205307]: ERROR   14:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:38:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:38:05 compute-0 nova_compute[192698]: 2025-10-01 14:38:05.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:08 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:08.847 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b6:61:c1 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bd66fdef-a294-43e8-ac5c-f8eccf359bf1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd66fdef-a294-43e8-ac5c-f8eccf359bf1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c3ff78cc16a4cf58b183cb67bd03327', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7fc83f94-6040-4d23-956e-8e8e7386619a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=70e92c02-b4ec-4bc7-9236-e6056614a673) old=Port_Binding(mac=['fa:16:3e:b6:61:c1'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-bd66fdef-a294-43e8-ac5c-f8eccf359bf1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd66fdef-a294-43e8-ac5c-f8eccf359bf1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c3ff78cc16a4cf58b183cb67bd03327', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:38:08 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:08.847 103791 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 70e92c02-b4ec-4bc7-9236-e6056614a673 in datapath bd66fdef-a294-43e8-ac5c-f8eccf359bf1 updated
Oct 01 14:38:08 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:08.848 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bd66fdef-a294-43e8-ac5c-f8eccf359bf1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:38:08 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:08.849 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f5b45a76-b005-48a2-8b07-3d212b10eef1]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:10 compute-0 nova_compute[192698]: 2025-10-01 14:38:10.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:13 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:13.983 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:38:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:14.020 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:38:14 compute-0 nova_compute[192698]: 2025-10-01 14:38:14.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:14 compute-0 podman[229900]: 2025-10-01 14:38:14.150995147 +0000 UTC m=+0.064624680 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent)
Oct 01 14:38:14 compute-0 podman[229901]: 2025-10-01 14:38:14.210396116 +0000 UTC m=+0.112355245 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 01 14:38:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:14.311 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:38:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:14.312 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:38:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:14.312 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:38:15 compute-0 nova_compute[192698]: 2025-10-01 14:38:15.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:20 compute-0 nova_compute[192698]: 2025-10-01 14:38:20.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:21 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:21.022 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:38:21 compute-0 podman[229945]: 2025-10-01 14:38:21.155890154 +0000 UTC m=+0.075534515 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Oct 01 14:38:25 compute-0 nova_compute[192698]: 2025-10-01 14:38:25.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:25 compute-0 nova_compute[192698]: 2025-10-01 14:38:25.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:27 compute-0 podman[229967]: 2025-10-01 14:38:27.159177791 +0000 UTC m=+0.073152912 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 01 14:38:27 compute-0 podman[229966]: 2025-10-01 14:38:27.192180439 +0000 UTC m=+0.105995986 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, config_id=iscsid, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 14:38:29 compute-0 podman[203144]: time="2025-10-01T14:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:38:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:38:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3025 "" "Go-http-client/1.1"
Oct 01 14:38:30 compute-0 nova_compute[192698]: 2025-10-01 14:38:30.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:31 compute-0 openstack_network_exporter[205307]: ERROR   14:38:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:38:31 compute-0 openstack_network_exporter[205307]: ERROR   14:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:38:31 compute-0 openstack_network_exporter[205307]: ERROR   14:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:38:31 compute-0 openstack_network_exporter[205307]: ERROR   14:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:38:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:38:31 compute-0 openstack_network_exporter[205307]: ERROR   14:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:38:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:38:32 compute-0 podman[230007]: 2025-10-01 14:38:32.152883524 +0000 UTC m=+0.074856166 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:38:35 compute-0 nova_compute[192698]: 2025-10-01 14:38:35.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:36 compute-0 nova_compute[192698]: 2025-10-01 14:38:36.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:38:37 compute-0 nova_compute[192698]: 2025-10-01 14:38:37.437 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:38:37 compute-0 nova_compute[192698]: 2025-10-01 14:38:37.437 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:38:37 compute-0 nova_compute[192698]: 2025-10-01 14:38:37.438 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:38:37 compute-0 nova_compute[192698]: 2025-10-01 14:38:37.438 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:38:37 compute-0 nova_compute[192698]: 2025-10-01 14:38:37.683 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:38:37 compute-0 nova_compute[192698]: 2025-10-01 14:38:37.685 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:38:37 compute-0 nova_compute[192698]: 2025-10-01 14:38:37.718 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:38:37 compute-0 nova_compute[192698]: 2025-10-01 14:38:37.719 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5865MB free_disk=73.30204010009766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:38:37 compute-0 nova_compute[192698]: 2025-10-01 14:38:37.722 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:38:37 compute-0 nova_compute[192698]: 2025-10-01 14:38:37.723 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:38:38 compute-0 nova_compute[192698]: 2025-10-01 14:38:38.831 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:38:38 compute-0 nova_compute[192698]: 2025-10-01 14:38:38.831 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:38:37 up  1:37,  0 user,  load average: 0.14, 0.15, 0.20\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:38:38 compute-0 nova_compute[192698]: 2025-10-01 14:38:38.845 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing inventories for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 01 14:38:38 compute-0 nova_compute[192698]: 2025-10-01 14:38:38.858 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating ProviderTree inventory for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 01 14:38:38 compute-0 nova_compute[192698]: 2025-10-01 14:38:38.858 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 14:38:38 compute-0 nova_compute[192698]: 2025-10-01 14:38:38.869 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing aggregate associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 01 14:38:38 compute-0 nova_compute[192698]: 2025-10-01 14:38:38.887 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing trait associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, traits: COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 01 14:38:38 compute-0 nova_compute[192698]: 2025-10-01 14:38:38.910 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:38:39 compute-0 nova_compute[192698]: 2025-10-01 14:38:39.417 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:38:39 compute-0 nova_compute[192698]: 2025-10-01 14:38:39.927 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:38:39 compute-0 nova_compute[192698]: 2025-10-01 14:38:39.927 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.204s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:38:40 compute-0 nova_compute[192698]: 2025-10-01 14:38:40.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:38:40 compute-0 nova_compute[192698]: 2025-10-01 14:38:40.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:38:40 compute-0 nova_compute[192698]: 2025-10-01 14:38:40.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Oct 01 14:38:40 compute-0 nova_compute[192698]: 2025-10-01 14:38:40.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:38:40 compute-0 nova_compute[192698]: 2025-10-01 14:38:40.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:40 compute-0 nova_compute[192698]: 2025-10-01 14:38:40.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Oct 01 14:38:41 compute-0 nova_compute[192698]: 2025-10-01 14:38:41.928 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:38:41 compute-0 nova_compute[192698]: 2025-10-01 14:38:41.929 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:38:41 compute-0 nova_compute[192698]: 2025-10-01 14:38:41.929 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:38:41 compute-0 nova_compute[192698]: 2025-10-01 14:38:41.929 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:38:42 compute-0 nova_compute[192698]: 2025-10-01 14:38:42.492 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquiring lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:38:42 compute-0 nova_compute[192698]: 2025-10-01 14:38:42.493 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:38:42 compute-0 nova_compute[192698]: 2025-10-01 14:38:42.998 2 DEBUG nova.compute.manager [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Oct 01 14:38:43 compute-0 nova_compute[192698]: 2025-10-01 14:38:43.537 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:38:43 compute-0 nova_compute[192698]: 2025-10-01 14:38:43.537 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:38:43 compute-0 nova_compute[192698]: 2025-10-01 14:38:43.546 2 DEBUG nova.virt.hardware [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Oct 01 14:38:43 compute-0 nova_compute[192698]: 2025-10-01 14:38:43.546 2 INFO nova.compute.claims [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Claim successful on node compute-0.ctlplane.example.com
Oct 01 14:38:43 compute-0 nova_compute[192698]: 2025-10-01 14:38:43.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:38:44 compute-0 nova_compute[192698]: 2025-10-01 14:38:44.594 2 DEBUG nova.compute.provider_tree [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:38:44 compute-0 nova_compute[192698]: 2025-10-01 14:38:44.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:38:44 compute-0 nova_compute[192698]: 2025-10-01 14:38:44.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 01 14:38:45 compute-0 nova_compute[192698]: 2025-10-01 14:38:45.103 2 DEBUG nova.scheduler.client.report [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:38:45 compute-0 podman[230032]: 2025-10-01 14:38:45.163314125 +0000 UTC m=+0.072910774 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:38:45 compute-0 podman[230033]: 2025-10-01 14:38:45.215089209 +0000 UTC m=+0.117604767 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct 01 14:38:45 compute-0 nova_compute[192698]: 2025-10-01 14:38:45.615 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.078s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:38:45 compute-0 nova_compute[192698]: 2025-10-01 14:38:45.616 2 DEBUG nova.compute.manager [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Oct 01 14:38:45 compute-0 nova_compute[192698]: 2025-10-01 14:38:45.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:45 compute-0 nova_compute[192698]: 2025-10-01 14:38:45.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:46 compute-0 nova_compute[192698]: 2025-10-01 14:38:46.129 2 DEBUG nova.compute.manager [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Oct 01 14:38:46 compute-0 nova_compute[192698]: 2025-10-01 14:38:46.130 2 DEBUG nova.network.neutron [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Oct 01 14:38:46 compute-0 nova_compute[192698]: 2025-10-01 14:38:46.131 2 WARNING neutronclient.v2_0.client [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:38:46 compute-0 nova_compute[192698]: 2025-10-01 14:38:46.131 2 WARNING neutronclient.v2_0.client [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:38:46 compute-0 nova_compute[192698]: 2025-10-01 14:38:46.641 2 INFO nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct 01 14:38:46 compute-0 nova_compute[192698]: 2025-10-01 14:38:46.742 2 DEBUG nova.network.neutron [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Successfully created port: 61c2e2cb-af2f-4655-8355-5b824716752d _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Oct 01 14:38:47 compute-0 nova_compute[192698]: 2025-10-01 14:38:47.150 2 DEBUG nova.compute.manager [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Oct 01 14:38:47 compute-0 nova_compute[192698]: 2025-10-01 14:38:47.433 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:38:47 compute-0 nova_compute[192698]: 2025-10-01 14:38:47.435 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:38:47 compute-0 nova_compute[192698]: 2025-10-01 14:38:47.435 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.055 2 DEBUG nova.network.neutron [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Successfully updated port: 61c2e2cb-af2f-4655-8355-5b824716752d _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.121 2 DEBUG nova.compute.manager [req-90ce0a4e-8859-42bf-8340-a916047135b7 req-b611adc5-65bd-4acb-882c-e20d1a69ab98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Received event network-changed-61c2e2cb-af2f-4655-8355-5b824716752d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.121 2 DEBUG nova.compute.manager [req-90ce0a4e-8859-42bf-8340-a916047135b7 req-b611adc5-65bd-4acb-882c-e20d1a69ab98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Refreshing instance network info cache due to event network-changed-61c2e2cb-af2f-4655-8355-5b824716752d. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.122 2 DEBUG oslo_concurrency.lockutils [req-90ce0a4e-8859-42bf-8340-a916047135b7 req-b611adc5-65bd-4acb-882c-e20d1a69ab98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-fe38f557-1df5-4bed-af03-6c2d4887fc4d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.122 2 DEBUG oslo_concurrency.lockutils [req-90ce0a4e-8859-42bf-8340-a916047135b7 req-b611adc5-65bd-4acb-882c-e20d1a69ab98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-fe38f557-1df5-4bed-af03-6c2d4887fc4d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.122 2 DEBUG nova.network.neutron [req-90ce0a4e-8859-42bf-8340-a916047135b7 req-b611adc5-65bd-4acb-882c-e20d1a69ab98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Refreshing network info cache for port 61c2e2cb-af2f-4655-8355-5b824716752d _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.169 2 DEBUG nova.compute.manager [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.172 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.173 2 INFO nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Creating image(s)
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.173 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquiring lock "/var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.174 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "/var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.175 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "/var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.175 2 DEBUG oslo_utils.imageutils.format_inspector [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.180 2 DEBUG oslo_utils.imageutils.format_inspector [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.182 2 DEBUG oslo_concurrency.processutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.265 2 DEBUG oslo_concurrency.processutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.266 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.267 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.268 2 DEBUG oslo_utils.imageutils.format_inspector [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.272 2 DEBUG oslo_utils.imageutils.format_inspector [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.272 2 DEBUG oslo_concurrency.processutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.357 2 DEBUG oslo_concurrency.processutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.359 2 DEBUG oslo_concurrency.processutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.408 2 DEBUG oslo_concurrency.processutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.409 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.410 2 DEBUG oslo_concurrency.processutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.462 2 DEBUG oslo_concurrency.processutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.463 2 DEBUG nova.virt.disk.api [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Checking if we can resize image /var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.464 2 DEBUG oslo_concurrency.processutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.522 2 DEBUG oslo_concurrency.processutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.524 2 DEBUG nova.virt.disk.api [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Cannot resize image /var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.524 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.525 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Ensure instance console log exists: /var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.525 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.526 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.526 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.562 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquiring lock "refresh_cache-fe38f557-1df5-4bed-af03-6c2d4887fc4d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.632 2 WARNING neutronclient.v2_0.client [req-90ce0a4e-8859-42bf-8340-a916047135b7 req-b611adc5-65bd-4acb-882c-e20d1a69ab98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.741 2 DEBUG nova.network.neutron [req-90ce0a4e-8859-42bf-8340-a916047135b7 req-b611adc5-65bd-4acb-882c-e20d1a69ab98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:38:48 compute-0 nova_compute[192698]: 2025-10-01 14:38:48.906 2 DEBUG nova.network.neutron [req-90ce0a4e-8859-42bf-8340-a916047135b7 req-b611adc5-65bd-4acb-882c-e20d1a69ab98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:38:49 compute-0 nova_compute[192698]: 2025-10-01 14:38:49.414 2 DEBUG oslo_concurrency.lockutils [req-90ce0a4e-8859-42bf-8340-a916047135b7 req-b611adc5-65bd-4acb-882c-e20d1a69ab98 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-fe38f557-1df5-4bed-af03-6c2d4887fc4d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:38:49 compute-0 nova_compute[192698]: 2025-10-01 14:38:49.415 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquired lock "refresh_cache-fe38f557-1df5-4bed-af03-6c2d4887fc4d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:38:49 compute-0 nova_compute[192698]: 2025-10-01 14:38:49.415 2 DEBUG nova.network.neutron [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:38:50 compute-0 nova_compute[192698]: 2025-10-01 14:38:50.753 2 DEBUG nova.network.neutron [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Oct 01 14:38:50 compute-0 nova_compute[192698]: 2025-10-01 14:38:50.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:38:50 compute-0 nova_compute[192698]: 2025-10-01 14:38:50.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:50 compute-0 nova_compute[192698]: 2025-10-01 14:38:50.977 2 WARNING neutronclient.v2_0.client [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:38:51 compute-0 nova_compute[192698]: 2025-10-01 14:38:51.878 2 DEBUG nova.network.neutron [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Updating instance_info_cache with network_info: [{"id": "61c2e2cb-af2f-4655-8355-5b824716752d", "address": "fa:16:3e:d1:fd:9f", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c2e2cb-af", "ovs_interfaceid": "61c2e2cb-af2f-4655-8355-5b824716752d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:38:52 compute-0 podman[230093]: 2025-10-01 14:38:52.178812448 +0000 UTC m=+0.083222262 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.385 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Releasing lock "refresh_cache-fe38f557-1df5-4bed-af03-6c2d4887fc4d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.386 2 DEBUG nova.compute.manager [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Instance network_info: |[{"id": "61c2e2cb-af2f-4655-8355-5b824716752d", "address": "fa:16:3e:d1:fd:9f", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c2e2cb-af", "ovs_interfaceid": "61c2e2cb-af2f-4655-8355-5b824716752d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.391 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Start _get_guest_xml network_info=[{"id": "61c2e2cb-af2f-4655-8355-5b824716752d", "address": "fa:16:3e:d1:fd:9f", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c2e2cb-af", "ovs_interfaceid": "61c2e2cb-af2f-4655-8355-5b824716752d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encrypted': False, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'guest_format': None, 'image_id': '48696e9b-a20d-4bf6-8ac2-6438fe748ab6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.397 2 WARNING nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.399 2 DEBUG nova.virt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-1131882147', uuid='fe38f557-1df5-4bed-af03-6c2d4887fc4d'), owner=OwnerMeta(userid='0a06999394394dbba2b16c054834a1a7', username='tempest-TestExecuteZoneMigrationStrategy-1109679165-project-admin', projectid='6c3ff78cc16a4cf58b183cb67bd03327', projectname='tempest-TestExecuteZoneMigrationStrategy-1109679165'), image=ImageMeta(id='48696e9b-a20d-4bf6-8ac2-6438fe748ab6', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='69702c4b-38f2-49d1-96d5-85671652c67e', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "61c2e2cb-af2f-4655-8355-5b824716752d", "address": "fa:16:3e:d1:fd:9f", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c2e2cb-af", "ovs_interfaceid": "61c2e2cb-af2f-4655-8355-5b824716752d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759329532.3989675) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.405 2 DEBUG nova.virt.libvirt.host [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.406 2 DEBUG nova.virt.libvirt.host [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.410 2 DEBUG nova.virt.libvirt.host [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.411 2 DEBUG nova.virt.libvirt.host [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.412 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.412 2 DEBUG nova.virt.hardware [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-01T13:57:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='69702c4b-38f2-49d1-96d5-85671652c67e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-01T13:57:39Z,direct_url=<?>,disk_format='qcow2',id=48696e9b-a20d-4bf6-8ac2-6438fe748ab6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9dacac6049d34f02846f752af09ae16f',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-01T13:57:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.413 2 DEBUG nova.virt.hardware [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.413 2 DEBUG nova.virt.hardware [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.414 2 DEBUG nova.virt.hardware [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.414 2 DEBUG nova.virt.hardware [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.415 2 DEBUG nova.virt.hardware [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.415 2 DEBUG nova.virt.hardware [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.416 2 DEBUG nova.virt.hardware [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.416 2 DEBUG nova.virt.hardware [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.416 2 DEBUG nova.virt.hardware [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.417 2 DEBUG nova.virt.hardware [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.424 2 DEBUG nova.virt.libvirt.vif [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:38:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1131882147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1131882147',id=37,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6c3ff78cc16a4cf58b183cb67bd03327',ramdisk_id='',reservation_id='r-fgoxgkfr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1109679165',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1109679165-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:38:47Z,user_data=None,user_id='0a06999394394dbba2b16c054834a1a7',uuid=fe38f557-1df5-4bed-af03-6c2d4887fc4d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61c2e2cb-af2f-4655-8355-5b824716752d", "address": "fa:16:3e:d1:fd:9f", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c2e2cb-af", "ovs_interfaceid": "61c2e2cb-af2f-4655-8355-5b824716752d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.425 2 DEBUG nova.network.os_vif_util [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Converting VIF {"id": "61c2e2cb-af2f-4655-8355-5b824716752d", "address": "fa:16:3e:d1:fd:9f", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c2e2cb-af", "ovs_interfaceid": "61c2e2cb-af2f-4655-8355-5b824716752d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.426 2 DEBUG nova.network.os_vif_util [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:fd:9f,bridge_name='br-int',has_traffic_filtering=True,id=61c2e2cb-af2f-4655-8355-5b824716752d,network=Network(9271916c-5214-4c09-935e-13b34b50b900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c2e2cb-af') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.427 2 DEBUG nova.objects.instance [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lazy-loading 'pci_devices' on Instance uuid fe38f557-1df5-4bed-af03-6c2d4887fc4d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.939 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] End _get_guest_xml xml=<domain type="kvm">
Oct 01 14:38:52 compute-0 nova_compute[192698]:   <uuid>fe38f557-1df5-4bed-af03-6c2d4887fc4d</uuid>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   <name>instance-00000025</name>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   <memory>131072</memory>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   <vcpu>1</vcpu>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   <metadata>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-1131882147</nova:name>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <nova:creationTime>2025-10-01 14:38:52</nova:creationTime>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <nova:flavor name="m1.nano" id="69702c4b-38f2-49d1-96d5-85671652c67e">
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:memory>128</nova:memory>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:disk>1</nova:disk>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:swap>0</nova:swap>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:ephemeral>0</nova:ephemeral>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:vcpus>1</nova:vcpus>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:extraSpecs>
Oct 01 14:38:52 compute-0 nova_compute[192698]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         </nova:extraSpecs>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       </nova:flavor>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <nova:image uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6">
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:containerFormat>bare</nova:containerFormat>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:diskFormat>qcow2</nova:diskFormat>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:minDisk>1</nova:minDisk>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:minRam>0</nova:minRam>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:properties>
Oct 01 14:38:52 compute-0 nova_compute[192698]:           <nova:property name="hw_rng_model">virtio</nova:property>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         </nova:properties>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       </nova:image>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <nova:owner>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:user uuid="0a06999394394dbba2b16c054834a1a7">tempest-TestExecuteZoneMigrationStrategy-1109679165-project-admin</nova:user>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:project uuid="6c3ff78cc16a4cf58b183cb67bd03327">tempest-TestExecuteZoneMigrationStrategy-1109679165</nova:project>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       </nova:owner>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <nova:root type="image" uuid="48696e9b-a20d-4bf6-8ac2-6438fe748ab6"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <nova:ports>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         <nova:port uuid="61c2e2cb-af2f-4655-8355-5b824716752d">
Oct 01 14:38:52 compute-0 nova_compute[192698]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:         </nova:port>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       </nova:ports>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     </nova:instance>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   </metadata>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   <sysinfo type="smbios">
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <system>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <entry name="manufacturer">RDO</entry>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <entry name="product">OpenStack Compute</entry>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <entry name="serial">fe38f557-1df5-4bed-af03-6c2d4887fc4d</entry>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <entry name="uuid">fe38f557-1df5-4bed-af03-6c2d4887fc4d</entry>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <entry name="family">Virtual Machine</entry>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     </system>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   </sysinfo>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   <os>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <type arch="x86_64" machine="q35">hvm</type>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <boot dev="hd"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <smbios mode="sysinfo"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   </os>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   <features>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <acpi/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <apic/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <vmcoreinfo/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   </features>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   <clock offset="utc">
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <timer name="pit" tickpolicy="delay"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <timer name="rtc" tickpolicy="catchup"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <timer name="hpet" present="no"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   </clock>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   <cpu mode="host-model" match="exact">
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <topology sockets="1" cores="1" threads="1"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   </cpu>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   <devices>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <disk type="file" device="disk">
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <driver name="qemu" type="qcow2" cache="none"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <target dev="vda" bus="virtio"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <disk type="file" device="cdrom">
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <driver name="qemu" type="raw" cache="none"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <source file="/var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk.config"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <target dev="sda" bus="sata"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     </disk>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <interface type="ethernet">
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <mac address="fa:16:3e:d1:fd:9f"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <driver name="vhost" rx_queue_size="512"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <mtu size="1442"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <target dev="tap61c2e2cb-af"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     </interface>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <serial type="pty">
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <log file="/var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/console.log" append="off"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     </serial>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <video>
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <model type="virtio"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     </video>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <input type="tablet" bus="usb"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <rng model="virtio">
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <backend model="random">/dev/urandom</backend>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     </rng>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="pci" model="pcie-root-port"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <controller type="usb" index="0"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Oct 01 14:38:52 compute-0 nova_compute[192698]:       <stats period="10"/>
Oct 01 14:38:52 compute-0 nova_compute[192698]:     </memballoon>
Oct 01 14:38:52 compute-0 nova_compute[192698]:   </devices>
Oct 01 14:38:52 compute-0 nova_compute[192698]: </domain>
Oct 01 14:38:52 compute-0 nova_compute[192698]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.940 2 DEBUG nova.compute.manager [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Preparing to wait for external event network-vif-plugged-61c2e2cb-af2f-4655-8355-5b824716752d prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.941 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquiring lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.941 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.942 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.943 2 DEBUG nova.virt.libvirt.vif [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-10-01T14:38:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1131882147',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1131882147',id=37,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6c3ff78cc16a4cf58b183cb67bd03327',ramdisk_id='',reservation_id='r-fgoxgkfr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1109679165',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1109679165-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:38:47Z,user_data=None,user_id='0a06999394394dbba2b16c054834a1a7',uuid=fe38f557-1df5-4bed-af03-6c2d4887fc4d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "61c2e2cb-af2f-4655-8355-5b824716752d", "address": "fa:16:3e:d1:fd:9f", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c2e2cb-af", "ovs_interfaceid": "61c2e2cb-af2f-4655-8355-5b824716752d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.943 2 DEBUG nova.network.os_vif_util [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Converting VIF {"id": "61c2e2cb-af2f-4655-8355-5b824716752d", "address": "fa:16:3e:d1:fd:9f", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c2e2cb-af", "ovs_interfaceid": "61c2e2cb-af2f-4655-8355-5b824716752d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.945 2 DEBUG nova.network.os_vif_util [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:fd:9f,bridge_name='br-int',has_traffic_filtering=True,id=61c2e2cb-af2f-4655-8355-5b824716752d,network=Network(9271916c-5214-4c09-935e-13b34b50b900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c2e2cb-af') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.945 2 DEBUG os_vif [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:fd:9f,bridge_name='br-int',has_traffic_filtering=True,id=61c2e2cb-af2f-4655-8355-5b824716752d,network=Network(9271916c-5214-4c09-935e-13b34b50b900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c2e2cb-af') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.947 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.947 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.949 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '58472907-c080-5b5e-b0ad-abc7af2cbb4b', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.956 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.956 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61c2e2cb-af, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.957 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap61c2e2cb-af, col_values=(('qos', UUID('97ce4b21-5d77-478a-81d2-4d5d84cc24d9')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.957 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap61c2e2cb-af, col_values=(('external_ids', {'iface-id': '61c2e2cb-af2f-4655-8355-5b824716752d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:fd:9f', 'vm-uuid': 'fe38f557-1df5-4bed-af03-6c2d4887fc4d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:52 compute-0 NetworkManager[51741]: <info>  [1759329532.9598] manager: (tap61c2e2cb-af): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:52 compute-0 nova_compute[192698]: 2025-10-01 14:38:52.969 2 INFO os_vif [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:fd:9f,bridge_name='br-int',has_traffic_filtering=True,id=61c2e2cb-af2f-4655-8355-5b824716752d,network=Network(9271916c-5214-4c09-935e-13b34b50b900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c2e2cb-af')
Oct 01 14:38:54 compute-0 nova_compute[192698]: 2025-10-01 14:38:54.518 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:38:54 compute-0 nova_compute[192698]: 2025-10-01 14:38:54.519 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Oct 01 14:38:54 compute-0 nova_compute[192698]: 2025-10-01 14:38:54.519 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] No VIF found with MAC fa:16:3e:d1:fd:9f, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Oct 01 14:38:54 compute-0 nova_compute[192698]: 2025-10-01 14:38:54.520 2 INFO nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Using config drive
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.036 2 WARNING neutronclient.v2_0.client [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.451 2 INFO nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Creating config drive at /var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk.config
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.458 2 DEBUG oslo_concurrency.processutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpsz1qiud2 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.599 2 DEBUG oslo_concurrency.processutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpsz1qiud2" returned: 0 in 0.141s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:38:55 compute-0 kernel: tap61c2e2cb-af: entered promiscuous mode
Oct 01 14:38:55 compute-0 NetworkManager[51741]: <info>  [1759329535.7015] manager: (tap61c2e2cb-af): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Oct 01 14:38:55 compute-0 ovn_controller[94909]: 2025-10-01T14:38:55Z|00271|binding|INFO|Claiming lport 61c2e2cb-af2f-4655-8355-5b824716752d for this chassis.
Oct 01 14:38:55 compute-0 ovn_controller[94909]: 2025-10-01T14:38:55Z|00272|binding|INFO|61c2e2cb-af2f-4655-8355-5b824716752d: Claiming fa:16:3e:d1:fd:9f 10.100.0.11
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.719 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:fd:9f 10.100.0.11'], port_security=['fa:16:3e:d1:fd:9f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'fe38f557-1df5-4bed-af03-6c2d4887fc4d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9271916c-5214-4c09-935e-13b34b50b900', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c3ff78cc16a4cf58b183cb67bd03327', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9d9fdca1-a833-4d6c-a710-d80f6740c5cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a4f5a34-42c0-4655-a64c-41091a291e78, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=61c2e2cb-af2f-4655-8355-5b824716752d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.721 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 61c2e2cb-af2f-4655-8355-5b824716752d in datapath 9271916c-5214-4c09-935e-13b34b50b900 bound to our chassis
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.722 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9271916c-5214-4c09-935e-13b34b50b900
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.741 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[fa878fb8-ae3d-4815-bf06-f93b9c333283]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.742 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9271916c-51 in ovnmeta-9271916c-5214-4c09-935e-13b34b50b900 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.745 214114 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9271916c-50 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.745 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4cf113fa-ec0f-4b34-bbbe-f7488dcbb965]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.746 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[94f3ac67-9c3d-44c7-9f9b-3baae64c8444]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:55 compute-0 systemd-machined[152704]: New machine qemu-26-instance-00000025.
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.766 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[33925c44-0102-4c81-9209-2661e54b2254]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.793 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1c9dc77d-58db-46bf-aab3-2fe2a0a3b91e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:55 compute-0 systemd[1]: Started Virtual Machine qemu-26-instance-00000025.
Oct 01 14:38:55 compute-0 ovn_controller[94909]: 2025-10-01T14:38:55Z|00273|binding|INFO|Setting lport 61c2e2cb-af2f-4655-8355-5b824716752d ovn-installed in OVS
Oct 01 14:38:55 compute-0 ovn_controller[94909]: 2025-10-01T14:38:55Z|00274|binding|INFO|Setting lport 61c2e2cb-af2f-4655-8355-5b824716752d up in Southbound
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:55 compute-0 systemd-udevd[230139]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.828 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[3447ccc4-d158-4b86-9e7b-f96c615cd942]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.832 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[18c41d3e-5abf-4520-b487-8a02d705d778]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:55 compute-0 NetworkManager[51741]: <info>  [1759329535.8337] manager: (tap9271916c-50): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:55 compute-0 systemd-udevd[230141]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:38:55 compute-0 NetworkManager[51741]: <info>  [1759329535.8424] device (tap61c2e2cb-af): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:38:55 compute-0 NetworkManager[51741]: <info>  [1759329535.8431] device (tap61c2e2cb-af): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.871 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[ae7397b9-eb9e-4a22-9caf-ba256937b45d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.874 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf66029-081a-49bd-bbd9-7862b0ca2f00]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:55 compute-0 NetworkManager[51741]: <info>  [1759329535.9047] device (tap9271916c-50): carrier: link connected
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.913 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd8f811-604c-40e5-8625-89b90215d450]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.934 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d57040a6-428e-49ad-8f37-6f5ea5fd7e40]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9271916c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:8a:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589643, 'reachable_time': 37019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230166, 'error': None, 'target': 'ovnmeta-9271916c-5214-4c09-935e-13b34b50b900', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.956 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c5e6d319-4ca6-4909-b75f-38668479eba8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe05:8af6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589643, 'tstamp': 589643}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230168, 'error': None, 'target': 'ovnmeta-9271916c-5214-4c09-935e-13b34b50b900', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.979 2 DEBUG nova.compute.manager [req-d8b89e7b-54f3-47cb-ac9e-0f40fcadcf97 req-45aa4973-8c1f-4505-81c3-9e3fd4b4811d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Received event network-vif-plugged-61c2e2cb-af2f-4655-8355-5b824716752d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.980 2 DEBUG oslo_concurrency.lockutils [req-d8b89e7b-54f3-47cb-ac9e-0f40fcadcf97 req-45aa4973-8c1f-4505-81c3-9e3fd4b4811d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.980 2 DEBUG oslo_concurrency.lockutils [req-d8b89e7b-54f3-47cb-ac9e-0f40fcadcf97 req-45aa4973-8c1f-4505-81c3-9e3fd4b4811d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.981 2 DEBUG oslo_concurrency.lockutils [req-d8b89e7b-54f3-47cb-ac9e-0f40fcadcf97 req-45aa4973-8c1f-4505-81c3-9e3fd4b4811d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:38:55 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:55.980 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[9c40f25e-ee01-410a-8687-aa943bdb45ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9271916c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:8a:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589643, 'reachable_time': 37019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230169, 'error': None, 'target': 'ovnmeta-9271916c-5214-4c09-935e-13b34b50b900', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:55 compute-0 nova_compute[192698]: 2025-10-01 14:38:55.982 2 DEBUG nova.compute.manager [req-d8b89e7b-54f3-47cb-ac9e-0f40fcadcf97 req-45aa4973-8c1f-4505-81c3-9e3fd4b4811d 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Processing event network-vif-plugged-61c2e2cb-af2f-4655-8355-5b824716752d _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.022 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[528e506e-173c-4a8d-907b-93203d6f44ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.120 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[fbfe0878-5a37-47a9-8b32-b15c8ae8b871]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.122 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9271916c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.122 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.124 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9271916c-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:38:56 compute-0 nova_compute[192698]: 2025-10-01 14:38:56.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:56 compute-0 NetworkManager[51741]: <info>  [1759329536.1271] manager: (tap9271916c-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Oct 01 14:38:56 compute-0 kernel: tap9271916c-50: entered promiscuous mode
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.129 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9271916c-50, col_values=(('external_ids', {'iface-id': '28b1485d-6c1b-4553-a241-33ad08214b7a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:38:56 compute-0 nova_compute[192698]: 2025-10-01 14:38:56.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:56 compute-0 ovn_controller[94909]: 2025-10-01T14:38:56Z|00275|binding|INFO|Releasing lport 28b1485d-6c1b-4553-a241-33ad08214b7a from this chassis (sb_readonly=0)
Oct 01 14:38:56 compute-0 nova_compute[192698]: 2025-10-01 14:38:56.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.158 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[be2d4831-8baf-43b7-ad2c-8d29a83cfa52]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.160 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9271916c-5214-4c09-935e-13b34b50b900.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9271916c-5214-4c09-935e-13b34b50b900.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.160 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9271916c-5214-4c09-935e-13b34b50b900.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9271916c-5214-4c09-935e-13b34b50b900.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.160 103791 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 9271916c-5214-4c09-935e-13b34b50b900 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.160 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9271916c-5214-4c09-935e-13b34b50b900.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9271916c-5214-4c09-935e-13b34b50b900.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.161 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[ac75f749-2ee6-4824-9351-6be62a1e0cfd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.162 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9271916c-5214-4c09-935e-13b34b50b900.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9271916c-5214-4c09-935e-13b34b50b900.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.163 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[4631352a-2600-48d4-89e9-1d1ef1a86449]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.164 103791 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: global
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     log         /dev/log local0 debug
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     log-tag     haproxy-metadata-proxy-9271916c-5214-4c09-935e-13b34b50b900
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     user        root
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     group       root
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     maxconn     1024
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     pidfile     /var/lib/neutron/external/pids/9271916c-5214-4c09-935e-13b34b50b900.pid.haproxy
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     daemon
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: defaults
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     log global
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     mode http
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     option httplog
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     option dontlognull
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     option http-server-close
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     option forwardfor
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     retries                 3
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     timeout http-request    30s
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     timeout connect         30s
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     timeout client          32s
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     timeout server          32s
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     timeout http-keep-alive 30s
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: listen listener
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     bind 169.254.169.254:80
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     server metadata /var/lib/neutron/metadata_proxy
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:     http-request add-header X-OVN-Network-ID 9271916c-5214-4c09-935e-13b34b50b900
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Oct 01 14:38:56 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:38:56.165 103791 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9271916c-5214-4c09-935e-13b34b50b900', 'env', 'PROCESS_TAG=haproxy-9271916c-5214-4c09-935e-13b34b50b900', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9271916c-5214-4c09-935e-13b34b50b900.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Oct 01 14:38:56 compute-0 nova_compute[192698]: 2025-10-01 14:38:56.440 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 01 14:38:56 compute-0 podman[230209]: 2025-10-01 14:38:56.633495001 +0000 UTC m=+0.053979124 container create c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct 01 14:38:56 compute-0 nova_compute[192698]: 2025-10-01 14:38:56.654 2 DEBUG nova.compute.manager [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Oct 01 14:38:56 compute-0 nova_compute[192698]: 2025-10-01 14:38:56.659 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Oct 01 14:38:56 compute-0 nova_compute[192698]: 2025-10-01 14:38:56.663 2 INFO nova.virt.libvirt.driver [-] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Instance spawned successfully.
Oct 01 14:38:56 compute-0 nova_compute[192698]: 2025-10-01 14:38:56.664 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Oct 01 14:38:56 compute-0 systemd[1]: Started libpod-conmon-c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f.scope.
Oct 01 14:38:56 compute-0 podman[230209]: 2025-10-01 14:38:56.607772349 +0000 UTC m=+0.028256462 image pull 0c139338a67144a0d88e07ef5f38b20d3085af4a1586fd8115d3776c8f9c633c 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Oct 01 14:38:56 compute-0 systemd[1]: Started libcrun container.
Oct 01 14:38:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/412c63c195865ccee0bc02697d5fe32db90fe89db7db68b79dd7578213986241/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct 01 14:38:56 compute-0 podman[230209]: 2025-10-01 14:38:56.746552544 +0000 UTC m=+0.167036668 container init c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:38:56 compute-0 podman[230209]: 2025-10-01 14:38:56.752527395 +0000 UTC m=+0.173011508 container start c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:38:56 compute-0 neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900[230224]: [NOTICE]   (230228) : New worker (230230) forked
Oct 01 14:38:56 compute-0 neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900[230224]: [NOTICE]   (230228) : Loading success.
Oct 01 14:38:56 compute-0 nova_compute[192698]: 2025-10-01 14:38:56.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:38:57 compute-0 nova_compute[192698]: 2025-10-01 14:38:57.180 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:38:57 compute-0 nova_compute[192698]: 2025-10-01 14:38:57.181 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:38:57 compute-0 nova_compute[192698]: 2025-10-01 14:38:57.181 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:38:57 compute-0 nova_compute[192698]: 2025-10-01 14:38:57.181 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:38:57 compute-0 nova_compute[192698]: 2025-10-01 14:38:57.182 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:38:57 compute-0 nova_compute[192698]: 2025-10-01 14:38:57.182 2 DEBUG nova.virt.libvirt.driver [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Oct 01 14:38:57 compute-0 nova_compute[192698]: 2025-10-01 14:38:57.693 2 INFO nova.compute.manager [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Took 9.52 seconds to spawn the instance on the hypervisor.
Oct 01 14:38:57 compute-0 nova_compute[192698]: 2025-10-01 14:38:57.693 2 DEBUG nova.compute.manager [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:38:57 compute-0 nova_compute[192698]: 2025-10-01 14:38:57.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:38:58 compute-0 nova_compute[192698]: 2025-10-01 14:38:58.058 2 DEBUG nova.compute.manager [req-29632f04-cf71-4bd7-bf70-47ef954ab498 req-3451b3c7-2375-433b-8628-d6714db23d25 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Received event network-vif-plugged-61c2e2cb-af2f-4655-8355-5b824716752d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:38:58 compute-0 nova_compute[192698]: 2025-10-01 14:38:58.059 2 DEBUG oslo_concurrency.lockutils [req-29632f04-cf71-4bd7-bf70-47ef954ab498 req-3451b3c7-2375-433b-8628-d6714db23d25 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:38:58 compute-0 nova_compute[192698]: 2025-10-01 14:38:58.059 2 DEBUG oslo_concurrency.lockutils [req-29632f04-cf71-4bd7-bf70-47ef954ab498 req-3451b3c7-2375-433b-8628-d6714db23d25 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:38:58 compute-0 nova_compute[192698]: 2025-10-01 14:38:58.060 2 DEBUG oslo_concurrency.lockutils [req-29632f04-cf71-4bd7-bf70-47ef954ab498 req-3451b3c7-2375-433b-8628-d6714db23d25 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:38:58 compute-0 nova_compute[192698]: 2025-10-01 14:38:58.060 2 DEBUG nova.compute.manager [req-29632f04-cf71-4bd7-bf70-47ef954ab498 req-3451b3c7-2375-433b-8628-d6714db23d25 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] No waiting events found dispatching network-vif-plugged-61c2e2cb-af2f-4655-8355-5b824716752d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:38:58 compute-0 nova_compute[192698]: 2025-10-01 14:38:58.061 2 WARNING nova.compute.manager [req-29632f04-cf71-4bd7-bf70-47ef954ab498 req-3451b3c7-2375-433b-8628-d6714db23d25 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Received unexpected event network-vif-plugged-61c2e2cb-af2f-4655-8355-5b824716752d for instance with vm_state active and task_state None.
Oct 01 14:38:58 compute-0 podman[230239]: 2025-10-01 14:38:58.183858171 +0000 UTC m=+0.090203699 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 01 14:38:58 compute-0 podman[230240]: 2025-10-01 14:38:58.188441354 +0000 UTC m=+0.092104760 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Oct 01 14:38:58 compute-0 nova_compute[192698]: 2025-10-01 14:38:58.233 2 INFO nova.compute.manager [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Took 14.73 seconds to build instance.
Oct 01 14:38:58 compute-0 nova_compute[192698]: 2025-10-01 14:38:58.740 2 DEBUG oslo_concurrency.lockutils [None req-fa7c7d85-3fe9-4879-8371-2d67018170f9 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.248s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:38:59 compute-0 podman[203144]: time="2025-10-01T14:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:38:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:38:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3488 "" "Go-http-client/1.1"
Oct 01 14:39:00 compute-0 nova_compute[192698]: 2025-10-01 14:39:00.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:01 compute-0 openstack_network_exporter[205307]: ERROR   14:39:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:39:01 compute-0 openstack_network_exporter[205307]: ERROR   14:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:39:01 compute-0 openstack_network_exporter[205307]: ERROR   14:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:39:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:39:01 compute-0 openstack_network_exporter[205307]: ERROR   14:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:39:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:39:01 compute-0 openstack_network_exporter[205307]: ERROR   14:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:39:02 compute-0 nova_compute[192698]: 2025-10-01 14:39:02.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:03 compute-0 podman[230278]: 2025-10-01 14:39:03.170696249 +0000 UTC m=+0.078867644 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:39:05 compute-0 nova_compute[192698]: 2025-10-01 14:39:05.838 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:07 compute-0 nova_compute[192698]: 2025-10-01 14:39:07.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:09 compute-0 ovn_controller[94909]: 2025-10-01T14:39:09Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d1:fd:9f 10.100.0.11
Oct 01 14:39:09 compute-0 ovn_controller[94909]: 2025-10-01T14:39:09Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d1:fd:9f 10.100.0.11
Oct 01 14:39:10 compute-0 nova_compute[192698]: 2025-10-01 14:39:10.264 2 DEBUG nova.virt.libvirt.driver [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Creating tmpfile /var/lib/nova/instances/tmp1v36ml30 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Oct 01 14:39:10 compute-0 nova_compute[192698]: 2025-10-01 14:39:10.265 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:10 compute-0 nova_compute[192698]: 2025-10-01 14:39:10.276 2 DEBUG nova.compute.manager [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1v36ml30',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Oct 01 14:39:10 compute-0 nova_compute[192698]: 2025-10-01 14:39:10.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:12 compute-0 nova_compute[192698]: 2025-10-01 14:39:12.317 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:12 compute-0 nova_compute[192698]: 2025-10-01 14:39:12.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:14.314 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:14.315 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:14.316 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:15 compute-0 nova_compute[192698]: 2025-10-01 14:39:15.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:16 compute-0 podman[230323]: 2025-10-01 14:39:16.158944584 +0000 UTC m=+0.068242988 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Oct 01 14:39:16 compute-0 podman[230324]: 2025-10-01 14:39:16.198981441 +0000 UTC m=+0.101269837 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:39:16 compute-0 nova_compute[192698]: 2025-10-01 14:39:16.614 2 DEBUG nova.compute.manager [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1v36ml30',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9ee46d61-9bf8-493f-a8e0-254e41a9deb8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Oct 01 14:39:17 compute-0 nova_compute[192698]: 2025-10-01 14:39:17.631 2 DEBUG oslo_concurrency.lockutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-9ee46d61-9bf8-493f-a8e0-254e41a9deb8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:39:17 compute-0 nova_compute[192698]: 2025-10-01 14:39:17.632 2 DEBUG oslo_concurrency.lockutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-9ee46d61-9bf8-493f-a8e0-254e41a9deb8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:39:17 compute-0 nova_compute[192698]: 2025-10-01 14:39:17.633 2 DEBUG nova.network.neutron [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:39:17 compute-0 nova_compute[192698]: 2025-10-01 14:39:17.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:18 compute-0 nova_compute[192698]: 2025-10-01 14:39:18.141 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:19 compute-0 nova_compute[192698]: 2025-10-01 14:39:19.305 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:19 compute-0 nova_compute[192698]: 2025-10-01 14:39:19.474 2 DEBUG nova.network.neutron [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Updating instance_info_cache with network_info: [{"id": "b26a270a-4dc0-4403-b36c-0b5edd823d71", "address": "fa:16:3e:4a:b0:23", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb26a270a-4d", "ovs_interfaceid": "b26a270a-4dc0-4403-b36c-0b5edd823d71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:39:19 compute-0 nova_compute[192698]: 2025-10-01 14:39:19.982 2 DEBUG oslo_concurrency.lockutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-9ee46d61-9bf8-493f-a8e0-254e41a9deb8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:19.999 2 DEBUG nova.virt.libvirt.driver [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1v36ml30',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9ee46d61-9bf8-493f-a8e0-254e41a9deb8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.000 2 DEBUG nova.virt.libvirt.driver [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Creating instance directory: /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.001 2 DEBUG nova.virt.libvirt.driver [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Creating disk.info with the contents: {'/var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk': 'qcow2', '/var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.002 2 DEBUG nova.virt.libvirt.driver [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.003 2 DEBUG nova.objects.instance [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9ee46d61-9bf8-493f-a8e0-254e41a9deb8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.511 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.519 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.525 2 DEBUG oslo_concurrency.processutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.619 2 DEBUG oslo_concurrency.processutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
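The `qemu-img info` probes above run under oslo.concurrency's prlimit shim, which caps the child's address space at 1 GiB and its CPU time at 30 s so a malformed image cannot exhaust the compute host; `--force-share` allows inspecting an image another process has open. A minimal sketch of how that argv is assembled (`build_qemu_img_info_cmd` is a hypothetical helper for illustration, not Nova's actual code):

```python
# Sketch of the prlimit-wrapped "qemu-img info" command seen in the log.
# build_qemu_img_info_cmd is a hypothetical helper, not Nova's real API.

def build_qemu_img_info_cmd(path, mem_bytes=1073741824, cpu_secs=30):
    """Return the argv that runs qemu-img info under oslo's prlimit shim.

    --as caps the child's address space (bytes), --cpu caps CPU seconds;
    --force-share permits probing an image that qemu already has open.
    """
    return [
        "/usr/bin/python3", "-m", "oslo_concurrency.prlimit",
        "--as=%d" % mem_bytes, "--cpu=%d" % cpu_secs, "--",
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "info", path, "--force-share", "--output=json",
    ]

cmd = build_qemu_img_info_cmd(
    "/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546")
```

The `LC_ALL=C LANG=C` prefix pins the locale so any error text qemu-img emits is parseable regardless of host configuration.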
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.621 2 DEBUG oslo_concurrency.lockutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "f477473ce09fdc00484ca839f539813eb2fee546" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.621 2 DEBUG oslo_concurrency.lockutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.622 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.627 2 DEBUG oslo_utils.imageutils.format_inspector [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.628 2 DEBUG oslo_concurrency.processutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.702 2 DEBUG oslo_concurrency.processutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.703 2 DEBUG oslo_concurrency.processutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.742 2 DEBUG oslo_concurrency.processutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546,backing_fmt=raw /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
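The create step above makes the per-instance disk a qcow2 overlay whose backing file is the shared, content-addressed base image under `_base`, so only the instance's writes land in its own directory. A sketch of the same command construction (`build_overlay_cmd` is a hypothetical helper, not Nova's code):

```python
# Sketch of the qcow2-overlay creation command from the log.
# build_overlay_cmd is a hypothetical helper for illustration only.

def build_overlay_cmd(base, overlay, size_bytes):
    """argv for "qemu-img create" producing a qcow2 overlay on a raw base.

    backing_fmt=raw must state the base image's real format explicitly:
    recent qemu-img refuses to probe an unspecified backing format.
    """
    return [
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "create", "-f", "qcow2",
        "-o", "backing_file=%s,backing_fmt=raw" % base,
        overlay, str(size_bytes),
    ]

cmd = build_overlay_cmd(
    "/var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546",
    "/var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk",
    1073741824)
```

The trailing size (1 GiB here, matching the flavor's 1 GB root disk) sets the overlay's virtual size; the file itself starts near-empty and grows copy-on-write.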
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.744 2 DEBUG oslo_concurrency.lockutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "f477473ce09fdc00484ca839f539813eb2fee546" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.745 2 DEBUG oslo_concurrency.processutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.807 2 DEBUG oslo_concurrency.processutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f477473ce09fdc00484ca839f539813eb2fee546 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.809 2 DEBUG nova.virt.disk.api [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Checking if we can resize image /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.810 2 DEBUG oslo_concurrency.processutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.889 2 DEBUG oslo_concurrency.processutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.891 2 DEBUG nova.virt.disk.api [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Cannot resize image /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Oct 01 14:39:20 compute-0 nova_compute[192698]: 2025-10-01 14:39:20.891 2 DEBUG nova.objects.instance [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lazy-loading 'migration_context' on Instance uuid 9ee46d61-9bf8-493f-a8e0-254e41a9deb8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.402 2 DEBUG nova.objects.base [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Object Instance<9ee46d61-9bf8-493f-a8e0-254e41a9deb8> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.403 2 DEBUG oslo_concurrency.processutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.434 2 DEBUG oslo_concurrency.processutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk.config 497664" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.435 2 DEBUG nova.virt.libvirt.driver [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.436 2 DEBUG nova.virt.libvirt.vif [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-10-01T14:38:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1169481367',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1169481367',id=36,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:38:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6c3ff78cc16a4cf58b183cb67bd03327',ramdisk_id='',reservation_id='r-vchyp5cq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1109679165',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1109679165-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-01T14:38:37Z,user_data=None,user_id='0a06999394394dbba2b16c054834a1a7',uuid=9ee46d61-9bf8-493f-a8e0-254e41a9deb8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b26a270a-4dc0-4403-b36c-0b5edd823d71", "address": "fa:16:3e:4a:b0:23", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb26a270a-4d", "ovs_interfaceid": "b26a270a-4dc0-4403-b36c-0b5edd823d71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.437 2 DEBUG nova.network.os_vif_util [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converting VIF {"id": "b26a270a-4dc0-4403-b36c-0b5edd823d71", "address": "fa:16:3e:4a:b0:23", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb26a270a-4d", "ovs_interfaceid": "b26a270a-4dc0-4403-b36c-0b5edd823d71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.438 2 DEBUG nova.network.os_vif_util [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:b0:23,bridge_name='br-int',has_traffic_filtering=True,id=b26a270a-4dc0-4403-b36c-0b5edd823d71,network=Network(9271916c-5214-4c09-935e-13b34b50b900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb26a270a-4d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.438 2 DEBUG os_vif [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:b0:23,bridge_name='br-int',has_traffic_filtering=True,id=b26a270a-4dc0-4403-b36c-0b5edd823d71,network=Network(9271916c-5214-4c09-935e-13b34b50b900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb26a270a-4d') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.440 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.441 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.442 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'd8e95baf-3c3f-53ce-849a-a16b860d6d33', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.449 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb26a270a-4d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.450 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapb26a270a-4d, col_values=(('qos', UUID('3feee353-d3f3-4d11-bd3b-aa81880f244f')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.450 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapb26a270a-4d, col_values=(('external_ids', {'iface-id': 'b26a270a-4dc0-4403-b36c-0b5edd823d71', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:b0:23', 'vm-uuid': '9ee46d61-9bf8-493f-a8e0-254e41a9deb8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
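The three commands above run as a single OVSDB transaction: add the tap port to `br-int`, attach the noop QoS row, and stamp the `Interface` external_ids that ovn-controller matches against the OVN logical switch port (`iface-id` equals the Neutron port UUID). Nova drives this through ovsdbapp's Python IDL, but the equivalent `ovs-vsctl` invocation can be sketched as below (`build_plug_cmd` is a hypothetical helper; the QoS step is omitted):

```python
# Sketch of an ovs-vsctl argv mirroring the logged OVSDB transaction.
# build_plug_cmd is a hypothetical helper; Nova uses ovsdbapp, not ovs-vsctl.

def build_plug_cmd(bridge, port, iface_id, mac, vm_uuid):
    """One ovs-vsctl transaction: add the port, then set the external_ids
    that bind the OVS interface to its OVN logical port."""
    return [
        "ovs-vsctl",
        "--", "--may-exist", "add-port", bridge, port,
        "--", "set", "Interface", port,
        "external_ids:iface-id=%s" % iface_id,
        "external_ids:iface-status=active",
        "external_ids:attached-mac=%s" % mac,
        "external_ids:vm-uuid=%s" % vm_uuid,
    ]

cmd = build_plug_cmd(
    "br-int", "tapb26a270a-4d",
    "b26a270a-4dc0-4403-b36c-0b5edd823d71",
    "fa:16:3e:4a:b0:23",
    "9ee46d61-9bf8-493f-a8e0-254e41a9deb8")
```

`--may-exist` makes the add-port idempotent, matching `may_exist=True` in the logged AddPortCommand; grouping the operations with `--` keeps them in one transaction, as ovsdbapp does.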
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:21 compute-0 NetworkManager[51741]: <info>  [1759329561.4538] manager: (tapb26a270a-4d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.469 2 INFO os_vif [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:b0:23,bridge_name='br-int',has_traffic_filtering=True,id=b26a270a-4dc0-4403-b36c-0b5edd823d71,network=Network(9271916c-5214-4c09-935e-13b34b50b900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb26a270a-4d')
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.469 2 DEBUG nova.virt.libvirt.driver [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.469 2 DEBUG nova.compute.manager [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1v36ml30',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9ee46d61-9bf8-493f-a8e0-254e41a9deb8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.470 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:21 compute-0 nova_compute[192698]: 2025-10-01 14:39:21.761 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:22 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:22.045 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:39:22 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:22.046 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:39:22 compute-0 nova_compute[192698]: 2025-10-01 14:39:22.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:22 compute-0 nova_compute[192698]: 2025-10-01 14:39:22.380 2 DEBUG nova.network.neutron [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Port b26a270a-4dc0-4403-b36c-0b5edd823d71 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Oct 01 14:39:22 compute-0 nova_compute[192698]: 2025-10-01 14:39:22.399 2 DEBUG nova.compute.manager [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1v36ml30',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9ee46d61-9bf8-493f-a8e0-254e41a9deb8',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Oct 01 14:39:23 compute-0 podman[230387]: 2025-10-01 14:39:23.186551641 +0000 UTC m=+0.089130540 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Oct 01 14:39:25 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct 01 14:39:25 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct 01 14:39:25 compute-0 nova_compute[192698]: 2025-10-01 14:39:25.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:25 compute-0 kernel: tapb26a270a-4d: entered promiscuous mode
Oct 01 14:39:25 compute-0 NetworkManager[51741]: <info>  [1759329565.9846] manager: (tapb26a270a-4d): new Tun device (/org/freedesktop/NetworkManager/Devices/105)
Oct 01 14:39:25 compute-0 nova_compute[192698]: 2025-10-01 14:39:25.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:25 compute-0 ovn_controller[94909]: 2025-10-01T14:39:25Z|00276|binding|INFO|Claiming lport b26a270a-4dc0-4403-b36c-0b5edd823d71 for this additional chassis.
Oct 01 14:39:25 compute-0 ovn_controller[94909]: 2025-10-01T14:39:25Z|00277|binding|INFO|b26a270a-4dc0-4403-b36c-0b5edd823d71: Claiming fa:16:3e:4a:b0:23 10.100.0.4
Oct 01 14:39:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:25.995 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:b0:23 10.100.0.4'], port_security=['fa:16:3e:4a:b0:23 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9ee46d61-9bf8-493f-a8e0-254e41a9deb8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9271916c-5214-4c09-935e-13b34b50b900', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c3ff78cc16a4cf58b183cb67bd03327', 'neutron:revision_number': '10', 'neutron:security_group_ids': '9d9fdca1-a833-4d6c-a710-d80f6740c5cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a4f5a34-42c0-4655-a64c-41091a291e78, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=b26a270a-4dc0-4403-b36c-0b5edd823d71) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:39:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:25.996 103791 INFO neutron.agent.ovn.metadata.agent [-] Port b26a270a-4dc0-4403-b36c-0b5edd823d71 in datapath 9271916c-5214-4c09-935e-13b34b50b900 unbound from our chassis
Oct 01 14:39:25 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:25.998 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9271916c-5214-4c09-935e-13b34b50b900
Oct 01 14:39:26 compute-0 ovn_controller[94909]: 2025-10-01T14:39:26Z|00278|binding|INFO|Setting lport b26a270a-4dc0-4403-b36c-0b5edd823d71 ovn-installed in OVS
Oct 01 14:39:26 compute-0 nova_compute[192698]: 2025-10-01 14:39:26.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:26 compute-0 nova_compute[192698]: 2025-10-01 14:39:26.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:26 compute-0 nova_compute[192698]: 2025-10-01 14:39:26.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:26.021 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[823dfa8c-1ae8-43a1-95ca-3ba0b5cafef4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:26 compute-0 systemd-udevd[230440]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:39:26 compute-0 systemd-machined[152704]: New machine qemu-27-instance-00000024.
Oct 01 14:39:26 compute-0 NetworkManager[51741]: <info>  [1759329566.0512] device (tapb26a270a-4d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct 01 14:39:26 compute-0 NetworkManager[51741]: <info>  [1759329566.0536] device (tapb26a270a-4d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct 01 14:39:26 compute-0 systemd[1]: Started Virtual Machine qemu-27-instance-00000024.
Oct 01 14:39:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:26.076 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[403d98b9-14a3-4fcf-badb-ed5f5f6089e0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:26.081 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[49c7bb42-8383-4be7-8f16-860445484bcb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:26.126 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[8f56a965-6ce2-4c8c-bebf-0541ffddb9a3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:26.153 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[70dda53d-f255-42e5-9eb2-b75866209e8c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9271916c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:8a:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589643, 'reachable_time': 37019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230452, 'error': None, 'target': 'ovnmeta-9271916c-5214-4c09-935e-13b34b50b900', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:26.182 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[90d3523f-d314-4936-af4e-e7cc3327041e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9271916c-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589659, 'tstamp': 589659}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230454, 'error': None, 'target': 'ovnmeta-9271916c-5214-4c09-935e-13b34b50b900', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9271916c-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589664, 'tstamp': 589664}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230454, 'error': None, 'target': 'ovnmeta-9271916c-5214-4c09-935e-13b34b50b900', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:26.184 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9271916c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:26 compute-0 nova_compute[192698]: 2025-10-01 14:39:26.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:26 compute-0 nova_compute[192698]: 2025-10-01 14:39:26.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:26.189 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9271916c-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:26.189 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:39:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:26.190 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9271916c-50, col_values=(('external_ids', {'iface-id': '28b1485d-6c1b-4553-a241-33ad08214b7a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:26.190 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:39:26 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:26.192 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[8be57d82-02b3-4b35-af99-30b714b34f65]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-9271916c-5214-4c09-935e-13b34b50b900\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/9271916c-5214-4c09-935e-13b34b50b900.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 9271916c-5214-4c09-935e-13b34b50b900\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:26 compute-0 nova_compute[192698]: 2025-10-01 14:39:26.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:28 compute-0 ovn_controller[94909]: 2025-10-01T14:39:28Z|00279|binding|INFO|Claiming lport b26a270a-4dc0-4403-b36c-0b5edd823d71 for this chassis.
Oct 01 14:39:28 compute-0 ovn_controller[94909]: 2025-10-01T14:39:28Z|00280|binding|INFO|b26a270a-4dc0-4403-b36c-0b5edd823d71: Claiming fa:16:3e:4a:b0:23 10.100.0.4
Oct 01 14:39:28 compute-0 ovn_controller[94909]: 2025-10-01T14:39:28Z|00281|binding|INFO|Setting lport b26a270a-4dc0-4403-b36c-0b5edd823d71 up in Southbound
Oct 01 14:39:29 compute-0 podman[230477]: 2025-10-01 14:39:29.205574461 +0000 UTC m=+0.109836407 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Oct 01 14:39:29 compute-0 podman[230478]: 2025-10-01 14:39:29.228857498 +0000 UTC m=+0.131415718 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:39:29 compute-0 nova_compute[192698]: 2025-10-01 14:39:29.722 2 INFO nova.compute.manager [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Post operation of migration started
Oct 01 14:39:29 compute-0 nova_compute[192698]: 2025-10-01 14:39:29.723 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:29 compute-0 podman[203144]: time="2025-10-01T14:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:39:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20750 "" "Go-http-client/1.1"
Oct 01 14:39:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3499 "" "Go-http-client/1.1"
Oct 01 14:39:29 compute-0 nova_compute[192698]: 2025-10-01 14:39:29.812 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:29 compute-0 nova_compute[192698]: 2025-10-01 14:39:29.812 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:29 compute-0 nova_compute[192698]: 2025-10-01 14:39:29.896 2 DEBUG oslo_concurrency.lockutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "refresh_cache-9ee46d61-9bf8-493f-a8e0-254e41a9deb8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Oct 01 14:39:29 compute-0 nova_compute[192698]: 2025-10-01 14:39:29.896 2 DEBUG oslo_concurrency.lockutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquired lock "refresh_cache-9ee46d61-9bf8-493f-a8e0-254e41a9deb8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Oct 01 14:39:29 compute-0 nova_compute[192698]: 2025-10-01 14:39:29.896 2 DEBUG nova.network.neutron [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Oct 01 14:39:30 compute-0 nova_compute[192698]: 2025-10-01 14:39:30.402 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:30 compute-0 nova_compute[192698]: 2025-10-01 14:39:30.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:31 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:31.048 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:31 compute-0 nova_compute[192698]: 2025-10-01 14:39:31.135 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:31 compute-0 nova_compute[192698]: 2025-10-01 14:39:31.278 2 DEBUG nova.network.neutron [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Updating instance_info_cache with network_info: [{"id": "b26a270a-4dc0-4403-b36c-0b5edd823d71", "address": "fa:16:3e:4a:b0:23", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb26a270a-4d", "ovs_interfaceid": "b26a270a-4dc0-4403-b36c-0b5edd823d71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:39:31 compute-0 openstack_network_exporter[205307]: ERROR   14:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:39:31 compute-0 openstack_network_exporter[205307]: ERROR   14:39:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:39:31 compute-0 openstack_network_exporter[205307]: ERROR   14:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:39:31 compute-0 openstack_network_exporter[205307]: ERROR   14:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:39:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:39:31 compute-0 openstack_network_exporter[205307]: ERROR   14:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:39:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:39:31 compute-0 nova_compute[192698]: 2025-10-01 14:39:31.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:31 compute-0 nova_compute[192698]: 2025-10-01 14:39:31.786 2 DEBUG oslo_concurrency.lockutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Releasing lock "refresh_cache-9ee46d61-9bf8-493f-a8e0-254e41a9deb8" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Oct 01 14:39:32 compute-0 nova_compute[192698]: 2025-10-01 14:39:32.310 2 DEBUG oslo_concurrency.lockutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:32 compute-0 nova_compute[192698]: 2025-10-01 14:39:32.311 2 DEBUG oslo_concurrency.lockutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:32 compute-0 nova_compute[192698]: 2025-10-01 14:39:32.312 2 DEBUG oslo_concurrency.lockutils [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:32 compute-0 nova_compute[192698]: 2025-10-01 14:39:32.319 2 INFO nova.virt.libvirt.driver [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Oct 01 14:39:32 compute-0 virtqemud[192597]: Domain id=27 name='instance-00000024' uuid=9ee46d61-9bf8-493f-a8e0-254e41a9deb8 is tainted: custom-monitor
Oct 01 14:39:33 compute-0 nova_compute[192698]: 2025-10-01 14:39:33.331 2 INFO nova.virt.libvirt.driver [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Oct 01 14:39:34 compute-0 podman[230517]: 2025-10-01 14:39:34.156867462 +0000 UTC m=+0.073273394 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:39:34 compute-0 nova_compute[192698]: 2025-10-01 14:39:34.337 2 INFO nova.virt.libvirt.driver [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Oct 01 14:39:34 compute-0 nova_compute[192698]: 2025-10-01 14:39:34.343 2 DEBUG nova.compute.manager [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Oct 01 14:39:34 compute-0 nova_compute[192698]: 2025-10-01 14:39:34.855 2 DEBUG nova.objects.instance [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Oct 01 14:39:35 compute-0 nova_compute[192698]: 2025-10-01 14:39:35.873 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:35 compute-0 nova_compute[192698]: 2025-10-01 14:39:35.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:36 compute-0 nova_compute[192698]: 2025-10-01 14:39:36.016 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:36 compute-0 nova_compute[192698]: 2025-10-01 14:39:36.016 2 WARNING neutronclient.v2_0.client [None req-4b2b288b-b6ec-4be4-8140-b53eb19cb896 a0e4a5121b0e4d579c8f593495c74d31 31dc4633e7f44f51a82d0af792f040b8 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:36 compute-0 nova_compute[192698]: 2025-10-01 14:39:36.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.072 2 DEBUG oslo_concurrency.lockutils [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquiring lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.073 2 DEBUG oslo_concurrency.lockutils [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.073 2 DEBUG oslo_concurrency.lockutils [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquiring lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.074 2 DEBUG oslo_concurrency.lockutils [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.074 2 DEBUG oslo_concurrency.lockutils [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.091 2 INFO nova.compute.manager [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Terminating instance
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.431 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.609 2 DEBUG nova.compute.manager [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:39:38 compute-0 kernel: tap61c2e2cb-af (unregistering): left promiscuous mode
Oct 01 14:39:38 compute-0 NetworkManager[51741]: <info>  [1759329578.6377] device (tap61c2e2cb-af): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:39:38 compute-0 ovn_controller[94909]: 2025-10-01T14:39:38Z|00282|binding|INFO|Releasing lport 61c2e2cb-af2f-4655-8355-5b824716752d from this chassis (sb_readonly=0)
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:38 compute-0 ovn_controller[94909]: 2025-10-01T14:39:38Z|00283|binding|INFO|Setting lport 61c2e2cb-af2f-4655-8355-5b824716752d down in Southbound
Oct 01 14:39:38 compute-0 ovn_controller[94909]: 2025-10-01T14:39:38Z|00284|binding|INFO|Removing iface tap61c2e2cb-af ovn-installed in OVS
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.670 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:fd:9f 10.100.0.11'], port_security=['fa:16:3e:d1:fd:9f 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'fe38f557-1df5-4bed-af03-6c2d4887fc4d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9271916c-5214-4c09-935e-13b34b50b900', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c3ff78cc16a4cf58b183cb67bd03327', 'neutron:revision_number': '5', 'neutron:security_group_ids': '9d9fdca1-a833-4d6c-a710-d80f6740c5cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a4f5a34-42c0-4655-a64c-41091a291e78, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=61c2e2cb-af2f-4655-8355-5b824716752d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.671 103791 INFO neutron.agent.ovn.metadata.agent [-] Port 61c2e2cb-af2f-4655-8355-5b824716752d in datapath 9271916c-5214-4c09-935e-13b34b50b900 unbound from our chassis
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.674 103791 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9271916c-5214-4c09-935e-13b34b50b900
Oct 01 14:39:38 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000025.scope: Deactivated successfully.
Oct 01 14:39:38 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000025.scope: Consumed 14.196s CPU time.
Oct 01 14:39:38 compute-0 systemd-machined[152704]: Machine qemu-26-instance-00000025 terminated.
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.700 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[11c0ebba-7f2f-45b2-8343-35eca6349716]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.746 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[16321eb0-3af5-4265-9d21-f2e9be8ec99f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.749 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[907d3f7a-2886-458a-9602-d07c24f2ab51]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.790 215767 DEBUG oslo.privsep.daemon [-] privsep: reply[e558d0d6-6fe0-4f99-afa7-461da0f13e9c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.818 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[c3bfcff7-2e55-41e5-b4bc-152cd5597df9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9271916c-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:8a:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 78], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589643, 'reachable_time': 37019, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230553, 'error': None, 'target': 'ovnmeta-9271916c-5214-4c09-935e-13b34b50b900', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.849 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d8409077-6fcd-4371-947c-288a3aae60f7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9271916c-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589659, 'tstamp': 589659}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230555, 'error': None, 'target': 'ovnmeta-9271916c-5214-4c09-935e-13b34b50b900', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9271916c-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 589664, 'tstamp': 589664}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230555, 'error': None, 'target': 'ovnmeta-9271916c-5214-4c09-935e-13b34b50b900', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.850 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9271916c-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.858 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9271916c-50, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.858 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.858 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9271916c-50, col_values=(('external_ids', {'iface-id': '28b1485d-6c1b-4553-a241-33ad08214b7a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.858 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:38 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:38.860 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ffadab-30e0-4460-a359-de4060c8535f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-9271916c-5214-4c09-935e-13b34b50b900\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/9271916c-5214-4c09-935e-13b34b50b900.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 9271916c-5214-4c09-935e-13b34b50b900\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.892 2 DEBUG nova.compute.manager [req-0baeb1ec-e251-4258-a306-aaeca511554c req-1fb7af0b-8c1e-4494-a448-e907c02ec8e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Received event network-vif-unplugged-61c2e2cb-af2f-4655-8355-5b824716752d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.892 2 DEBUG oslo_concurrency.lockutils [req-0baeb1ec-e251-4258-a306-aaeca511554c req-1fb7af0b-8c1e-4494-a448-e907c02ec8e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.893 2 DEBUG oslo_concurrency.lockutils [req-0baeb1ec-e251-4258-a306-aaeca511554c req-1fb7af0b-8c1e-4494-a448-e907c02ec8e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.893 2 DEBUG oslo_concurrency.lockutils [req-0baeb1ec-e251-4258-a306-aaeca511554c req-1fb7af0b-8c1e-4494-a448-e907c02ec8e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.893 2 DEBUG nova.compute.manager [req-0baeb1ec-e251-4258-a306-aaeca511554c req-1fb7af0b-8c1e-4494-a448-e907c02ec8e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] No waiting events found dispatching network-vif-unplugged-61c2e2cb-af2f-4655-8355-5b824716752d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.893 2 DEBUG nova.compute.manager [req-0baeb1ec-e251-4258-a306-aaeca511554c req-1fb7af0b-8c1e-4494-a448-e907c02ec8e4 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Received event network-vif-unplugged-61c2e2cb-af2f-4655-8355-5b824716752d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.897 2 INFO nova.virt.libvirt.driver [-] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Instance destroyed successfully.
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.897 2 DEBUG nova.objects.instance [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lazy-loading 'resources' on Instance uuid fe38f557-1df5-4bed-af03-6c2d4887fc4d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.946 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.947 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.947 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:38 compute-0 nova_compute[192698]: 2025-10-01 14:39:38.947 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.403 2 DEBUG nova.virt.libvirt.vif [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-10-01T14:38:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1131882147',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1131882147',id=37,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:38:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6c3ff78cc16a4cf58b183cb67bd03327',ramdisk_id='',reservation_id='r-fgoxgkfr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin,manager',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_
disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1109679165',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1109679165-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:38:57Z,user_data=None,user_id='0a06999394394dbba2b16c054834a1a7',uuid=fe38f557-1df5-4bed-af03-6c2d4887fc4d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61c2e2cb-af2f-4655-8355-5b824716752d", "address": "fa:16:3e:d1:fd:9f", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c2e2cb-af", "ovs_interfaceid": "61c2e2cb-af2f-4655-8355-5b824716752d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.404 2 DEBUG nova.network.os_vif_util [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Converting VIF {"id": "61c2e2cb-af2f-4655-8355-5b824716752d", "address": "fa:16:3e:d1:fd:9f", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61c2e2cb-af", "ovs_interfaceid": "61c2e2cb-af2f-4655-8355-5b824716752d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.405 2 DEBUG nova.network.os_vif_util [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:fd:9f,bridge_name='br-int',has_traffic_filtering=True,id=61c2e2cb-af2f-4655-8355-5b824716752d,network=Network(9271916c-5214-4c09-935e-13b34b50b900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c2e2cb-af') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.406 2 DEBUG os_vif [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:fd:9f,bridge_name='br-int',has_traffic_filtering=True,id=61c2e2cb-af2f-4655-8355-5b824716752d,network=Network(9271916c-5214-4c09-935e-13b34b50b900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c2e2cb-af') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.410 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61c2e2cb-af, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.415 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=97ce4b21-5d77-478a-81d2-4d5d84cc24d9) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.419 2 INFO os_vif [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:fd:9f,bridge_name='br-int',has_traffic_filtering=True,id=61c2e2cb-af2f-4655-8355-5b824716752d,network=Network(9271916c-5214-4c09-935e-13b34b50b900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61c2e2cb-af')
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.420 2 INFO nova.virt.libvirt.driver [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Deleting instance files /var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d_del
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.421 2 INFO nova.virt.libvirt.driver [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Deletion of /var/lib/nova/instances/fe38f557-1df5-4bed-af03-6c2d4887fc4d_del complete
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.935 2 INFO nova.compute.manager [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Took 1.33 seconds to destroy the instance on the hypervisor.
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.936 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.937 2 DEBUG nova.compute.manager [-] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.937 2 DEBUG nova.network.neutron [-] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:39:39 compute-0 nova_compute[192698]: 2025-10-01 14:39:39.938 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.004 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.097 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.098 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.164 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.166 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Error from libvirt while getting description of instance-00000025: [Error Code 42] Domain not found: no domain with matching uuid 'fe38f557-1df5-4bed-af03-6c2d4887fc4d' (instance-00000025): libvirt.libvirtError: Domain not found: no domain with matching uuid 'fe38f557-1df5-4bed-af03-6c2d4887fc4d' (instance-00000025)
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.341 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.342 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.371 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.372 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5670MB free_disk=73.24382400512695GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.373 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.373 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.769 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.976 2 DEBUG nova.compute.manager [req-54c82aac-ab95-4b81-9a70-695704644503 req-9f00f245-f57f-4dc8-abe9-1543e76bcf38 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Received event network-vif-unplugged-61c2e2cb-af2f-4655-8355-5b824716752d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.976 2 DEBUG oslo_concurrency.lockutils [req-54c82aac-ab95-4b81-9a70-695704644503 req-9f00f245-f57f-4dc8-abe9-1543e76bcf38 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.976 2 DEBUG oslo_concurrency.lockutils [req-54c82aac-ab95-4b81-9a70-695704644503 req-9f00f245-f57f-4dc8-abe9-1543e76bcf38 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.977 2 DEBUG oslo_concurrency.lockutils [req-54c82aac-ab95-4b81-9a70-695704644503 req-9f00f245-f57f-4dc8-abe9-1543e76bcf38 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.977 2 DEBUG nova.compute.manager [req-54c82aac-ab95-4b81-9a70-695704644503 req-9f00f245-f57f-4dc8-abe9-1543e76bcf38 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] No waiting events found dispatching network-vif-unplugged-61c2e2cb-af2f-4655-8355-5b824716752d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:39:40 compute-0 nova_compute[192698]: 2025-10-01 14:39:40.977 2 DEBUG nova.compute.manager [req-54c82aac-ab95-4b81-9a70-695704644503 req-9f00f245-f57f-4dc8-abe9-1543e76bcf38 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Received event network-vif-unplugged-61c2e2cb-af2f-4655-8355-5b824716752d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:39:41 compute-0 nova_compute[192698]: 2025-10-01 14:39:41.464 2 DEBUG nova.compute.manager [req-e59267dc-fc94-4e00-8b80-b4a1eb623941 req-f765835f-a277-4b43-830f-4b1371985e59 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Received event network-vif-deleted-61c2e2cb-af2f-4655-8355-5b824716752d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:39:41 compute-0 nova_compute[192698]: 2025-10-01 14:39:41.464 2 INFO nova.compute.manager [req-e59267dc-fc94-4e00-8b80-b4a1eb623941 req-f765835f-a277-4b43-830f-4b1371985e59 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Neutron deleted interface 61c2e2cb-af2f-4655-8355-5b824716752d; detaching it from the instance and deleting it from the info cache
Oct 01 14:39:41 compute-0 nova_compute[192698]: 2025-10-01 14:39:41.464 2 DEBUG nova.network.neutron [req-e59267dc-fc94-4e00-8b80-b4a1eb623941 req-f765835f-a277-4b43-830f-4b1371985e59 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:39:41 compute-0 nova_compute[192698]: 2025-10-01 14:39:41.877 2 DEBUG nova.network.neutron [-] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:39:41 compute-0 nova_compute[192698]: 2025-10-01 14:39:41.972 2 DEBUG nova.compute.manager [req-e59267dc-fc94-4e00-8b80-b4a1eb623941 req-f765835f-a277-4b43-830f-4b1371985e59 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Detach interface failed, port_id=61c2e2cb-af2f-4655-8355-5b824716752d, reason: Instance fe38f557-1df5-4bed-af03-6c2d4887fc4d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:39:41 compute-0 nova_compute[192698]: 2025-10-01 14:39:41.990 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance fe38f557-1df5-4bed-af03-6c2d4887fc4d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:39:41 compute-0 nova_compute[192698]: 2025-10-01 14:39:41.990 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Instance 9ee46d61-9bf8-493f-a8e0-254e41a9deb8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Oct 01 14:39:41 compute-0 nova_compute[192698]: 2025-10-01 14:39:41.990 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:39:41 compute-0 nova_compute[192698]: 2025-10-01 14:39:41.991 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:39:40 up  1:39,  0 user,  load average: 0.34, 0.21, 0.22\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_6c3ff78cc16a4cf58b183cb67bd03327': '2', 'io_workload': '0', 'num_task_deleting': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:39:42 compute-0 nova_compute[192698]: 2025-10-01 14:39:42.105 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:39:42 compute-0 nova_compute[192698]: 2025-10-01 14:39:42.383 2 INFO nova.compute.manager [-] [instance: fe38f557-1df5-4bed-af03-6c2d4887fc4d] Took 2.45 seconds to deallocate network for instance.
Oct 01 14:39:42 compute-0 nova_compute[192698]: 2025-10-01 14:39:42.613 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:39:42 compute-0 nova_compute[192698]: 2025-10-01 14:39:42.908 2 DEBUG oslo_concurrency.lockutils [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:43 compute-0 nova_compute[192698]: 2025-10-01 14:39:43.124 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:39:43 compute-0 nova_compute[192698]: 2025-10-01 14:39:43.124 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.751s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:43 compute-0 nova_compute[192698]: 2025-10-01 14:39:43.125 2 DEBUG oslo_concurrency.lockutils [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.216s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:43 compute-0 nova_compute[192698]: 2025-10-01 14:39:43.202 2 DEBUG nova.compute.provider_tree [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:39:43 compute-0 nova_compute[192698]: 2025-10-01 14:39:43.711 2 DEBUG nova.scheduler.client.report [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:39:44 compute-0 nova_compute[192698]: 2025-10-01 14:39:44.226 2 DEBUG oslo_concurrency.lockutils [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:44 compute-0 nova_compute[192698]: 2025-10-01 14:39:44.257 2 INFO nova.scheduler.client.report [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Deleted allocations for instance fe38f557-1df5-4bed-af03-6c2d4887fc4d
Oct 01 14:39:44 compute-0 nova_compute[192698]: 2025-10-01 14:39:44.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:44 compute-0 nova_compute[192698]: 2025-10-01 14:39:44.618 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:39:45 compute-0 nova_compute[192698]: 2025-10-01 14:39:45.132 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:39:45 compute-0 nova_compute[192698]: 2025-10-01 14:39:45.133 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:39:45 compute-0 nova_compute[192698]: 2025-10-01 14:39:45.133 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:39:45 compute-0 nova_compute[192698]: 2025-10-01 14:39:45.133 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:39:45 compute-0 nova_compute[192698]: 2025-10-01 14:39:45.285 2 DEBUG oslo_concurrency.lockutils [None req-c5b37bb3-ec49-4bb7-bba4-8247bc7c12c6 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "fe38f557-1df5-4bed-af03-6c2d4887fc4d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.212s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:45 compute-0 nova_compute[192698]: 2025-10-01 14:39:45.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.027 2 DEBUG oslo_concurrency.lockutils [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquiring lock "9ee46d61-9bf8-493f-a8e0-254e41a9deb8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.028 2 DEBUG oslo_concurrency.lockutils [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "9ee46d61-9bf8-493f-a8e0-254e41a9deb8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.029 2 DEBUG oslo_concurrency.lockutils [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquiring lock "9ee46d61-9bf8-493f-a8e0-254e41a9deb8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.029 2 DEBUG oslo_concurrency.lockutils [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "9ee46d61-9bf8-493f-a8e0-254e41a9deb8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.030 2 DEBUG oslo_concurrency.lockutils [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "9ee46d61-9bf8-493f-a8e0-254e41a9deb8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.046 2 INFO nova.compute.manager [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Terminating instance
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.428 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.568 2 DEBUG nova.compute.manager [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Oct 01 14:39:46 compute-0 kernel: tapb26a270a-4d (unregistering): left promiscuous mode
Oct 01 14:39:46 compute-0 NetworkManager[51741]: <info>  [1759329586.5981] device (tapb26a270a-4d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct 01 14:39:46 compute-0 ovn_controller[94909]: 2025-10-01T14:39:46Z|00285|binding|INFO|Releasing lport b26a270a-4dc0-4403-b36c-0b5edd823d71 from this chassis (sb_readonly=0)
Oct 01 14:39:46 compute-0 ovn_controller[94909]: 2025-10-01T14:39:46Z|00286|binding|INFO|Setting lport b26a270a-4dc0-4403-b36c-0b5edd823d71 down in Southbound
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:46 compute-0 ovn_controller[94909]: 2025-10-01T14:39:46Z|00287|binding|INFO|Removing iface tapb26a270a-4d ovn-installed in OVS
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.623 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:b0:23 10.100.0.4'], port_security=['fa:16:3e:4a:b0:23 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9ee46d61-9bf8-493f-a8e0-254e41a9deb8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9271916c-5214-4c09-935e-13b34b50b900', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c3ff78cc16a4cf58b183cb67bd03327', 'neutron:revision_number': '15', 'neutron:security_group_ids': '9d9fdca1-a833-4d6c-a710-d80f6740c5cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a4f5a34-42c0-4655-a64c-41091a291e78, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=b26a270a-4dc0-4403-b36c-0b5edd823d71) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.624 103791 INFO neutron.agent.ovn.metadata.agent [-] Port b26a270a-4dc0-4403-b36c-0b5edd823d71 in datapath 9271916c-5214-4c09-935e-13b34b50b900 unbound from our chassis
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.626 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9271916c-5214-4c09-935e-13b34b50b900, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.627 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d9cea204-bccd-45bb-98a0-3d239fbc82f7]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.627 103791 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9271916c-5214-4c09-935e-13b34b50b900 namespace which is not needed anymore
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:46 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000024.scope: Deactivated successfully.
Oct 01 14:39:46 compute-0 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000024.scope: Consumed 2.495s CPU time.
Oct 01 14:39:46 compute-0 systemd-machined[152704]: Machine qemu-27-instance-00000024 terminated.
Oct 01 14:39:46 compute-0 podman[230583]: 2025-10-01 14:39:46.706815628 +0000 UTC m=+0.069115241 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:39:46 compute-0 podman[230585]: 2025-10-01 14:39:46.742703274 +0000 UTC m=+0.109802856 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 14:39:46 compute-0 neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900[230224]: [NOTICE]   (230228) : haproxy version is 3.0.5-8e879a5
Oct 01 14:39:46 compute-0 neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900[230224]: [NOTICE]   (230228) : path to executable is /usr/sbin/haproxy
Oct 01 14:39:46 compute-0 neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900[230224]: [WARNING]  (230228) : Exiting Master process...
Oct 01 14:39:46 compute-0 neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900[230224]: [ALERT]    (230228) : Current worker (230230) exited with code 143 (Terminated)
Oct 01 14:39:46 compute-0 neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900[230224]: [WARNING]  (230228) : All workers exited. Exiting... (0)
Oct 01 14:39:46 compute-0 podman[230645]: 2025-10-01 14:39:46.781156899 +0000 UTC m=+0.033264246 container kill c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Oct 01 14:39:46 compute-0 systemd[1]: libpod-c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f.scope: Deactivated successfully.
Oct 01 14:39:46 compute-0 conmon[230224]: conmon c390d828b405f999f495 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f.scope/container/memory.events
Oct 01 14:39:46 compute-0 kernel: tapb26a270a-4d: entered promiscuous mode
Oct 01 14:39:46 compute-0 systemd-udevd[230608]: Network interface NamePolicy= disabled on kernel command line.
Oct 01 14:39:46 compute-0 NetworkManager[51741]: <info>  [1759329586.7936] manager: (tapb26a270a-4d): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:46 compute-0 ovn_controller[94909]: 2025-10-01T14:39:46Z|00288|binding|INFO|Claiming lport b26a270a-4dc0-4403-b36c-0b5edd823d71 for this chassis.
Oct 01 14:39:46 compute-0 ovn_controller[94909]: 2025-10-01T14:39:46Z|00289|binding|INFO|b26a270a-4dc0-4403-b36c-0b5edd823d71: Claiming fa:16:3e:4a:b0:23 10.100.0.4
Oct 01 14:39:46 compute-0 kernel: tapb26a270a-4d (unregistering): left promiscuous mode
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.805 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:b0:23 10.100.0.4'], port_security=['fa:16:3e:4a:b0:23 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9ee46d61-9bf8-493f-a8e0-254e41a9deb8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9271916c-5214-4c09-935e-13b34b50b900', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c3ff78cc16a4cf58b183cb67bd03327', 'neutron:revision_number': '15', 'neutron:security_group_ids': '9d9fdca1-a833-4d6c-a710-d80f6740c5cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a4f5a34-42c0-4655-a64c-41091a291e78, chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=b26a270a-4dc0-4403-b36c-0b5edd823d71) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:39:46 compute-0 ovn_controller[94909]: 2025-10-01T14:39:46Z|00290|binding|INFO|Setting lport b26a270a-4dc0-4403-b36c-0b5edd823d71 ovn-installed in OVS
Oct 01 14:39:46 compute-0 ovn_controller[94909]: 2025-10-01T14:39:46Z|00291|binding|INFO|Setting lport b26a270a-4dc0-4403-b36c-0b5edd823d71 up in Southbound
Oct 01 14:39:46 compute-0 ovn_controller[94909]: 2025-10-01T14:39:46Z|00292|binding|INFO|Releasing lport b26a270a-4dc0-4403-b36c-0b5edd823d71 from this chassis (sb_readonly=1)
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:46 compute-0 ovn_controller[94909]: 2025-10-01T14:39:46Z|00293|if_status|INFO|Not setting lport b26a270a-4dc0-4403-b36c-0b5edd823d71 down as sb is readonly
Oct 01 14:39:46 compute-0 ovn_controller[94909]: 2025-10-01T14:39:46Z|00294|binding|INFO|Removing iface tapb26a270a-4d ovn-installed in OVS
Oct 01 14:39:46 compute-0 ovn_controller[94909]: 2025-10-01T14:39:46Z|00295|binding|INFO|Releasing lport b26a270a-4dc0-4403-b36c-0b5edd823d71 from this chassis (sb_readonly=0)
Oct 01 14:39:46 compute-0 ovn_controller[94909]: 2025-10-01T14:39:46Z|00296|binding|INFO|Setting lport b26a270a-4dc0-4403-b36c-0b5edd823d71 down in Southbound
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.830 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:b0:23 10.100.0.4'], port_security=['fa:16:3e:4a:b0:23 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9ee46d61-9bf8-493f-a8e0-254e41a9deb8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9271916c-5214-4c09-935e-13b34b50b900', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c3ff78cc16a4cf58b183cb67bd03327', 'neutron:revision_number': '15', 'neutron:security_group_ids': '9d9fdca1-a833-4d6c-a710-d80f6740c5cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a4f5a34-42c0-4655-a64c-41091a291e78, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>], logical_port=b26a270a-4dc0-4403-b36c-0b5edd823d71) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7b1d951e80>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:39:46 compute-0 podman[230663]: 2025-10-01 14:39:46.836709985 +0000 UTC m=+0.029662130 container died c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.842 2 DEBUG nova.compute.manager [req-fd80aefa-bea0-4e5e-8ab1-ab7b79cdf0ce req-81711e0a-c187-4f70-a7c7-5fe40c2e4c71 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Received event network-vif-unplugged-b26a270a-4dc0-4403-b36c-0b5edd823d71 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.843 2 DEBUG oslo_concurrency.lockutils [req-fd80aefa-bea0-4e5e-8ab1-ab7b79cdf0ce req-81711e0a-c187-4f70-a7c7-5fe40c2e4c71 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "9ee46d61-9bf8-493f-a8e0-254e41a9deb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.843 2 DEBUG oslo_concurrency.lockutils [req-fd80aefa-bea0-4e5e-8ab1-ab7b79cdf0ce req-81711e0a-c187-4f70-a7c7-5fe40c2e4c71 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9ee46d61-9bf8-493f-a8e0-254e41a9deb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.844 2 DEBUG oslo_concurrency.lockutils [req-fd80aefa-bea0-4e5e-8ab1-ab7b79cdf0ce req-81711e0a-c187-4f70-a7c7-5fe40c2e4c71 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9ee46d61-9bf8-493f-a8e0-254e41a9deb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.844 2 DEBUG nova.compute.manager [req-fd80aefa-bea0-4e5e-8ab1-ab7b79cdf0ce req-81711e0a-c187-4f70-a7c7-5fe40c2e4c71 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] No waiting events found dispatching network-vif-unplugged-b26a270a-4dc0-4403-b36c-0b5edd823d71 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.844 2 DEBUG nova.compute.manager [req-fd80aefa-bea0-4e5e-8ab1-ab7b79cdf0ce req-81711e0a-c187-4f70-a7c7-5fe40c2e4c71 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Received event network-vif-unplugged-b26a270a-4dc0-4403-b36c-0b5edd823d71 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.856 2 INFO nova.virt.libvirt.driver [-] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Instance destroyed successfully.
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.857 2 DEBUG nova.objects.instance [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lazy-loading 'resources' on Instance uuid 9ee46d61-9bf8-493f-a8e0-254e41a9deb8 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Oct 01 14:39:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f-userdata-shm.mount: Deactivated successfully.
Oct 01 14:39:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-412c63c195865ccee0bc02697d5fe32db90fe89db7db68b79dd7578213986241-merged.mount: Deactivated successfully.
Oct 01 14:39:46 compute-0 podman[230663]: 2025-10-01 14:39:46.888607832 +0000 UTC m=+0.081559957 container cleanup c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 01 14:39:46 compute-0 systemd[1]: libpod-conmon-c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f.scope: Deactivated successfully.
Oct 01 14:39:46 compute-0 podman[230672]: 2025-10-01 14:39:46.905344902 +0000 UTC m=+0.077834696 container remove c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.924 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[d318cdac-7607-44f1-a9ac-aed6e623b737]: (4, ("Wed Oct  1 02:39:46 PM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900 (c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f)\nc390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f\nWed Oct  1 02:39:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9271916c-5214-4c09-935e-13b34b50b900 (c390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f)\nc390d828b405f999f49579075b59fddd9020b7cbc886e9eef07573ddb29e153f\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.926 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[eef5109d-9b6a-4f17-854c-7162b337d2fa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.926 103791 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9271916c-5214-4c09-935e-13b34b50b900.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9271916c-5214-4c09-935e-13b34b50b900.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.927 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[f5dc93e6-f3f5-4e55-ba07-8577adab5d24]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.928 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9271916c-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:46 compute-0 kernel: tap9271916c-50: left promiscuous mode
Oct 01 14:39:46 compute-0 nova_compute[192698]: 2025-10-01 14:39:46.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.954 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[1dd0824c-527d-4b95-8659-87b9f6aff90b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.978 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[3512cd89-2be0-4dba-90cd-d67185489450]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.979 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[77436813-8e33-4939-94f4-5a8c4b51a73c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.994 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[827fd410-9f22-43a6-be2e-222d1fa0a17d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 589635, 'reachable_time': 20558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230705, 'error': None, 'target': 'ovnmeta-9271916c-5214-4c09-935e-13b34b50b900', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.997 103910 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9271916c-5214-4c09-935e-13b34b50b900 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.998 103910 DEBUG oslo.privsep.daemon [-] privsep: reply[d797731e-c4f0-4ae6-a6ef-19827bd91028]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:46 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:46.998 103791 INFO neutron.agent.ovn.metadata.agent [-] Port b26a270a-4dc0-4403-b36c-0b5edd823d71 in datapath 9271916c-5214-4c09-935e-13b34b50b900 unbound from our chassis
Oct 01 14:39:47 compute-0 systemd[1]: run-netns-ovnmeta\x2d9271916c\x2d5214\x2d4c09\x2d935e\x2d13b34b50b900.mount: Deactivated successfully.
Oct 01 14:39:47 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:47.000 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9271916c-5214-4c09-935e-13b34b50b900, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:39:47 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:47.001 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[7696e537-6a2c-4f10-b64b-555d5b0d281b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:47 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:47.001 103791 INFO neutron.agent.ovn.metadata.agent [-] Port b26a270a-4dc0-4403-b36c-0b5edd823d71 in datapath 9271916c-5214-4c09-935e-13b34b50b900 unbound from our chassis
Oct 01 14:39:47 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:47.002 103791 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9271916c-5214-4c09-935e-13b34b50b900, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Oct 01 14:39:47 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:39:47.003 214114 DEBUG oslo.privsep.daemon [-] privsep: reply[431040d5-9f7b-443a-992f-5f3d7c04f060]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.368 2 DEBUG nova.virt.libvirt.vif [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-10-01T14:38:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1169481367',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1169481367',id=36,image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-01T14:38:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6c3ff78cc16a4cf58b183cb67bd03327',ramdisk_id='',reservation_id='r-vchyp5cq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin,manager',clean_attempts='1',image_base_image_ref='48696e9b-a20d-4bf6-8ac2-6438fe748ab6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1109679165',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1109679165-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-01T14:39:35Z,user_data=None,user_id='0a06999394394dbba2b16c054834a1a7',uuid=9ee46d61-9bf8-493f-a8e0-254e41a9deb8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b26a270a-4dc0-4403-b36c-0b5edd823d71", "address": "fa:16:3e:4a:b0:23", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb26a270a-4d", "ovs_interfaceid": "b26a270a-4dc0-4403-b36c-0b5edd823d71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.369 2 DEBUG nova.network.os_vif_util [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Converting VIF {"id": "b26a270a-4dc0-4403-b36c-0b5edd823d71", "address": "fa:16:3e:4a:b0:23", "network": {"id": "9271916c-5214-4c09-935e-13b34b50b900", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-146524267-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bae9f8c3123c4b158b8c2b37547b3432", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb26a270a-4d", "ovs_interfaceid": "b26a270a-4dc0-4403-b36c-0b5edd823d71", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.370 2 DEBUG nova.network.os_vif_util [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4a:b0:23,bridge_name='br-int',has_traffic_filtering=True,id=b26a270a-4dc0-4403-b36c-0b5edd823d71,network=Network(9271916c-5214-4c09-935e-13b34b50b900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb26a270a-4d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.371 2 DEBUG os_vif [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:b0:23,bridge_name='br-int',has_traffic_filtering=True,id=b26a270a-4dc0-4403-b36c-0b5edd823d71,network=Network(9271916c-5214-4c09-935e-13b34b50b900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb26a270a-4d') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.375 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb26a270a-4d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.381 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=3feee353-d3f3-4d11-bd3b-aa81880f244f) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.387 2 INFO os_vif [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:b0:23,bridge_name='br-int',has_traffic_filtering=True,id=b26a270a-4dc0-4403-b36c-0b5edd823d71,network=Network(9271916c-5214-4c09-935e-13b34b50b900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb26a270a-4d')
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.388 2 INFO nova.virt.libvirt.driver [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Deleting instance files /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8_del
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.389 2 INFO nova.virt.libvirt.driver [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Deletion of /var/lib/nova/instances/9ee46d61-9bf8-493f-a8e0-254e41a9deb8_del complete
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.902 2 INFO nova.compute.manager [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Took 1.33 seconds to destroy the instance on the hypervisor.
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.903 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.903 2 DEBUG nova.compute.manager [-] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.904 2 DEBUG nova.network.neutron [-] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Oct 01 14:39:47 compute-0 nova_compute[192698]: 2025-10-01 14:39:47.904 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:48 compute-0 nova_compute[192698]: 2025-10-01 14:39:48.768 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Oct 01 14:39:48 compute-0 nova_compute[192698]: 2025-10-01 14:39:48.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:39:49 compute-0 nova_compute[192698]: 2025-10-01 14:39:49.486 2 DEBUG nova.compute.manager [req-9a541023-6283-4aae-93bc-0c99a5eebacf req-3bcb25b0-90b5-49fb-ae45-d03d3f1dcb2e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Received event network-vif-unplugged-b26a270a-4dc0-4403-b36c-0b5edd823d71 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:39:49 compute-0 nova_compute[192698]: 2025-10-01 14:39:49.487 2 DEBUG oslo_concurrency.lockutils [req-9a541023-6283-4aae-93bc-0c99a5eebacf req-3bcb25b0-90b5-49fb-ae45-d03d3f1dcb2e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Acquiring lock "9ee46d61-9bf8-493f-a8e0-254e41a9deb8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:49 compute-0 nova_compute[192698]: 2025-10-01 14:39:49.487 2 DEBUG oslo_concurrency.lockutils [req-9a541023-6283-4aae-93bc-0c99a5eebacf req-3bcb25b0-90b5-49fb-ae45-d03d3f1dcb2e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9ee46d61-9bf8-493f-a8e0-254e41a9deb8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:49 compute-0 nova_compute[192698]: 2025-10-01 14:39:49.488 2 DEBUG oslo_concurrency.lockutils [req-9a541023-6283-4aae-93bc-0c99a5eebacf req-3bcb25b0-90b5-49fb-ae45-d03d3f1dcb2e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] Lock "9ee46d61-9bf8-493f-a8e0-254e41a9deb8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:49 compute-0 nova_compute[192698]: 2025-10-01 14:39:49.488 2 DEBUG nova.compute.manager [req-9a541023-6283-4aae-93bc-0c99a5eebacf req-3bcb25b0-90b5-49fb-ae45-d03d3f1dcb2e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] No waiting events found dispatching network-vif-unplugged-b26a270a-4dc0-4403-b36c-0b5edd823d71 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Oct 01 14:39:49 compute-0 nova_compute[192698]: 2025-10-01 14:39:49.489 2 DEBUG nova.compute.manager [req-9a541023-6283-4aae-93bc-0c99a5eebacf req-3bcb25b0-90b5-49fb-ae45-d03d3f1dcb2e 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Received event network-vif-unplugged-b26a270a-4dc0-4403-b36c-0b5edd823d71 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Oct 01 14:39:49 compute-0 nova_compute[192698]: 2025-10-01 14:39:49.672 2 DEBUG nova.compute.manager [req-448f02d7-a86a-4a18-8653-e1f019ce8f31 req-86490121-69d4-4be7-93a6-169136a8bb76 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Received event network-vif-deleted-b26a270a-4dc0-4403-b36c-0b5edd823d71 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Oct 01 14:39:49 compute-0 nova_compute[192698]: 2025-10-01 14:39:49.673 2 INFO nova.compute.manager [req-448f02d7-a86a-4a18-8653-e1f019ce8f31 req-86490121-69d4-4be7-93a6-169136a8bb76 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Neutron deleted interface b26a270a-4dc0-4403-b36c-0b5edd823d71; detaching it from the instance and deleting it from the info cache
Oct 01 14:39:49 compute-0 nova_compute[192698]: 2025-10-01 14:39:49.673 2 DEBUG nova.network.neutron [req-448f02d7-a86a-4a18-8653-e1f019ce8f31 req-86490121-69d4-4be7-93a6-169136a8bb76 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:39:50 compute-0 nova_compute[192698]: 2025-10-01 14:39:50.102 2 DEBUG nova.network.neutron [-] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Oct 01 14:39:50 compute-0 nova_compute[192698]: 2025-10-01 14:39:50.182 2 DEBUG nova.compute.manager [req-448f02d7-a86a-4a18-8653-e1f019ce8f31 req-86490121-69d4-4be7-93a6-169136a8bb76 3978d2118493438d8590356c6cba06f2 31dc4633e7f44f51a82d0af792f040b8 - - default default] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Detach interface failed, port_id=b26a270a-4dc0-4403-b36c-0b5edd823d71, reason: Instance 9ee46d61-9bf8-493f-a8e0-254e41a9deb8 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Oct 01 14:39:50 compute-0 nova_compute[192698]: 2025-10-01 14:39:50.609 2 INFO nova.compute.manager [-] [instance: 9ee46d61-9bf8-493f-a8e0-254e41a9deb8] Took 2.71 seconds to deallocate network for instance.
Oct 01 14:39:50 compute-0 nova_compute[192698]: 2025-10-01 14:39:50.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:51 compute-0 nova_compute[192698]: 2025-10-01 14:39:51.137 2 DEBUG oslo_concurrency.lockutils [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:39:51 compute-0 nova_compute[192698]: 2025-10-01 14:39:51.137 2 DEBUG oslo_concurrency.lockutils [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:39:51 compute-0 nova_compute[192698]: 2025-10-01 14:39:51.199 2 DEBUG nova.compute.provider_tree [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:39:51 compute-0 nova_compute[192698]: 2025-10-01 14:39:51.707 2 DEBUG nova.scheduler.client.report [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:39:52 compute-0 nova_compute[192698]: 2025-10-01 14:39:52.218 2 DEBUG oslo_concurrency.lockutils [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.081s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:52 compute-0 nova_compute[192698]: 2025-10-01 14:39:52.250 2 INFO nova.scheduler.client.report [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Deleted allocations for instance 9ee46d61-9bf8-493f-a8e0-254e41a9deb8
Oct 01 14:39:52 compute-0 nova_compute[192698]: 2025-10-01 14:39:52.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:53 compute-0 nova_compute[192698]: 2025-10-01 14:39:53.283 2 DEBUG oslo_concurrency.lockutils [None req-c229f9a6-192e-4418-9a6d-42dd6cca07a7 0a06999394394dbba2b16c054834a1a7 6c3ff78cc16a4cf58b183cb67bd03327 - - default default] Lock "9ee46d61-9bf8-493f-a8e0-254e41a9deb8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.255s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:39:54 compute-0 podman[230707]: 2025-10-01 14:39:54.223683115 +0000 UTC m=+0.126572008 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Oct 01 14:39:55 compute-0 nova_compute[192698]: 2025-10-01 14:39:55.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:56 compute-0 nova_compute[192698]: 2025-10-01 14:39:56.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:57 compute-0 nova_compute[192698]: 2025-10-01 14:39:57.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:39:59 compute-0 podman[203144]: time="2025-10-01T14:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:39:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:39:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3023 "" "Go-http-client/1.1"
Oct 01 14:40:00 compute-0 podman[230730]: 2025-10-01 14:40:00.143631798 +0000 UTC m=+0.062413881 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 01 14:40:00 compute-0 podman[230731]: 2025-10-01 14:40:00.156401401 +0000 UTC m=+0.072626306 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd)
Oct 01 14:40:00 compute-0 nova_compute[192698]: 2025-10-01 14:40:00.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:01 compute-0 openstack_network_exporter[205307]: ERROR   14:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:40:01 compute-0 openstack_network_exporter[205307]: ERROR   14:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:40:01 compute-0 openstack_network_exporter[205307]: ERROR   14:40:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:40:01 compute-0 openstack_network_exporter[205307]: ERROR   14:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:40:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:40:01 compute-0 openstack_network_exporter[205307]: ERROR   14:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:40:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:40:02 compute-0 nova_compute[192698]: 2025-10-01 14:40:02.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:05 compute-0 podman[230767]: 2025-10-01 14:40:05.175096637 +0000 UTC m=+0.084092114 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:40:05 compute-0 nova_compute[192698]: 2025-10-01 14:40:05.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:07 compute-0 nova_compute[192698]: 2025-10-01 14:40:07.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:10 compute-0 nova_compute[192698]: 2025-10-01 14:40:10.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:12 compute-0 nova_compute[192698]: 2025-10-01 14:40:12.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:40:14.317 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:40:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:40:14.317 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:40:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:40:14.317 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:40:15 compute-0 nova_compute[192698]: 2025-10-01 14:40:15.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:17 compute-0 podman[230795]: 2025-10-01 14:40:17.15644565 +0000 UTC m=+0.072009820 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Oct 01 14:40:17 compute-0 podman[230796]: 2025-10-01 14:40:17.207701359 +0000 UTC m=+0.120143505 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller)
Oct 01 14:40:17 compute-0 nova_compute[192698]: 2025-10-01 14:40:17.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:20 compute-0 nova_compute[192698]: 2025-10-01 14:40:20.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:22 compute-0 nova_compute[192698]: 2025-10-01 14:40:22.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:25 compute-0 podman[230838]: 2025-10-01 14:40:25.146906084 +0000 UTC m=+0.064727204 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, vcs-type=git, architecture=x86_64, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9)
Oct 01 14:40:25 compute-0 nova_compute[192698]: 2025-10-01 14:40:25.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:27 compute-0 nova_compute[192698]: 2025-10-01 14:40:27.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:29 compute-0 podman[203144]: time="2025-10-01T14:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:40:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:40:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3027 "" "Go-http-client/1.1"
Oct 01 14:40:30 compute-0 ovn_controller[94909]: 2025-10-01T14:40:30Z|00297|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Oct 01 14:40:30 compute-0 nova_compute[192698]: 2025-10-01 14:40:30.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:31 compute-0 podman[230859]: 2025-10-01 14:40:31.172564242 +0000 UTC m=+0.081745521 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct 01 14:40:31 compute-0 podman[230860]: 2025-10-01 14:40:31.192115188 +0000 UTC m=+0.097671430 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 01 14:40:31 compute-0 openstack_network_exporter[205307]: ERROR   14:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:40:31 compute-0 openstack_network_exporter[205307]: ERROR   14:40:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:40:31 compute-0 openstack_network_exporter[205307]: ERROR   14:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:40:31 compute-0 openstack_network_exporter[205307]: ERROR   14:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:40:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:40:31 compute-0 openstack_network_exporter[205307]: ERROR   14:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:40:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:40:32 compute-0 nova_compute[192698]: 2025-10-01 14:40:32.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:35 compute-0 nova_compute[192698]: 2025-10-01 14:40:35.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:36 compute-0 podman[230901]: 2025-10-01 14:40:36.185232975 +0000 UTC m=+0.085439911 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:40:37 compute-0 nova_compute[192698]: 2025-10-01 14:40:37.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:38 compute-0 nova_compute[192698]: 2025-10-01 14:40:38.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:40:39 compute-0 nova_compute[192698]: 2025-10-01 14:40:39.441 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:40:39 compute-0 nova_compute[192698]: 2025-10-01 14:40:39.441 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:40:39 compute-0 nova_compute[192698]: 2025-10-01 14:40:39.441 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:40:39 compute-0 nova_compute[192698]: 2025-10-01 14:40:39.442 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:40:39 compute-0 nova_compute[192698]: 2025-10-01 14:40:39.595 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:40:39 compute-0 nova_compute[192698]: 2025-10-01 14:40:39.597 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:40:39 compute-0 nova_compute[192698]: 2025-10-01 14:40:39.619 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:40:39 compute-0 nova_compute[192698]: 2025-10-01 14:40:39.620 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5843MB free_disk=73.3017807006836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:40:39 compute-0 nova_compute[192698]: 2025-10-01 14:40:39.620 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:40:39 compute-0 nova_compute[192698]: 2025-10-01 14:40:39.621 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:40:40 compute-0 nova_compute[192698]: 2025-10-01 14:40:40.672 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:40:40 compute-0 nova_compute[192698]: 2025-10-01 14:40:40.672 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:40:39 up  1:40,  0 user,  load average: 0.16, 0.19, 0.21\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:40:40 compute-0 nova_compute[192698]: 2025-10-01 14:40:40.692 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:40:40 compute-0 nova_compute[192698]: 2025-10-01 14:40:40.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:41 compute-0 nova_compute[192698]: 2025-10-01 14:40:41.200 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:40:41 compute-0 nova_compute[192698]: 2025-10-01 14:40:41.713 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:40:41 compute-0 nova_compute[192698]: 2025-10-01 14:40:41.713 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:40:42 compute-0 nova_compute[192698]: 2025-10-01 14:40:42.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:43 compute-0 nova_compute[192698]: 2025-10-01 14:40:43.714 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:40:43 compute-0 nova_compute[192698]: 2025-10-01 14:40:43.715 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:40:43 compute-0 nova_compute[192698]: 2025-10-01 14:40:43.716 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:40:43 compute-0 nova_compute[192698]: 2025-10-01 14:40:43.716 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:40:45 compute-0 nova_compute[192698]: 2025-10-01 14:40:45.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:46 compute-0 nova_compute[192698]: 2025-10-01 14:40:46.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:40:47 compute-0 nova_compute[192698]: 2025-10-01 14:40:47.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:48 compute-0 podman[230929]: 2025-10-01 14:40:48.137497475 +0000 UTC m=+0.054312913 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Oct 01 14:40:48 compute-0 podman[230930]: 2025-10-01 14:40:48.20120806 +0000 UTC m=+0.101415361 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller)
Oct 01 14:40:48 compute-0 nova_compute[192698]: 2025-10-01 14:40:48.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:40:48 compute-0 nova_compute[192698]: 2025-10-01 14:40:48.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:40:48 compute-0 nova_compute[192698]: 2025-10-01 14:40:48.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:40:50 compute-0 nova_compute[192698]: 2025-10-01 14:40:50.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:52 compute-0 nova_compute[192698]: 2025-10-01 14:40:52.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:53 compute-0 sshd-session[230973]: Accepted publickey for zuul from 192.168.122.10 port 42038 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 14:40:53 compute-0 systemd-logind[791]: New session 35 of user zuul.
Oct 01 14:40:53 compute-0 systemd[1]: Started Session 35 of User zuul.
Oct 01 14:40:53 compute-0 sshd-session[230973]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 14:40:53 compute-0 sudo[230977]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Oct 01 14:40:53 compute-0 sudo[230977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 14:40:55 compute-0 nova_compute[192698]: 2025-10-01 14:40:55.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:56 compute-0 podman[231114]: 2025-10-01 14:40:56.19486458 +0000 UTC m=+0.100011443 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Oct 01 14:40:57 compute-0 nova_compute[192698]: 2025-10-01 14:40:57.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:40:58 compute-0 ovs-vsctl[231171]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 01 14:40:59 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 231001 (sos)
Oct 01 14:40:59 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct 01 14:40:59 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct 01 14:40:59 compute-0 podman[203144]: time="2025-10-01T14:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:40:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:40:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3032 "" "Go-http-client/1.1"
Oct 01 14:40:59 compute-0 virtqemud[192597]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 01 14:40:59 compute-0 virtqemud[192597]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 01 14:40:59 compute-0 virtqemud[192597]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 01 14:41:00 compute-0 kernel: block sr0: the capability attribute has been deprecated.
Oct 01 14:41:00 compute-0 nova_compute[192698]: 2025-10-01 14:41:00.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:01 compute-0 crontab[231587]: (root) LIST (root)
Oct 01 14:41:01 compute-0 openstack_network_exporter[205307]: ERROR   14:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:41:01 compute-0 openstack_network_exporter[205307]: ERROR   14:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:41:01 compute-0 openstack_network_exporter[205307]: ERROR   14:41:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:41:01 compute-0 openstack_network_exporter[205307]: ERROR   14:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:41:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:41:01 compute-0 openstack_network_exporter[205307]: ERROR   14:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:41:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:41:02 compute-0 podman[231668]: 2025-10-01 14:41:02.174550122 +0000 UTC m=+0.069851811 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct 01 14:41:02 compute-0 podman[231664]: 2025-10-01 14:41:02.201341883 +0000 UTC m=+0.097625929 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Oct 01 14:41:02 compute-0 nova_compute[192698]: 2025-10-01 14:41:02.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:02 compute-0 sshd-session[231715]: Invalid user  from 65.49.1.239 port 54137
Oct 01 14:41:03 compute-0 systemd[1]: Starting Hostname Service...
Oct 01 14:41:03 compute-0 systemd[1]: Started Hostname Service.
Oct 01 14:41:05 compute-0 nova_compute[192698]: 2025-10-01 14:41:05.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:06 compute-0 sshd-session[231715]: Connection closed by invalid user  65.49.1.239 port 54137 [preauth]
Oct 01 14:41:07 compute-0 podman[231987]: 2025-10-01 14:41:07.164071411 +0000 UTC m=+0.074243210 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:41:07 compute-0 nova_compute[192698]: 2025-10-01 14:41:07.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:10 compute-0 ovs-appctl[232789]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 01 14:41:10 compute-0 ovs-appctl[232795]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 01 14:41:10 compute-0 ovs-appctl[232810]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct 01 14:41:10 compute-0 nova_compute[192698]: 2025-10-01 14:41:10.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:12 compute-0 nova_compute[192698]: 2025-10-01 14:41:12.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:41:14.319 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:41:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:41:14.319 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:41:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:41:14.319 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:41:16 compute-0 nova_compute[192698]: 2025-10-01 14:41:16.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:17 compute-0 nova_compute[192698]: 2025-10-01 14:41:17.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:18 compute-0 podman[233825]: 2025-10-01 14:41:18.320251945 +0000 UTC m=+0.109114008 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Oct 01 14:41:18 compute-0 podman[233856]: 2025-10-01 14:41:18.391009169 +0000 UTC m=+0.145446476 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, io.buildah.version=1.41.4)
Oct 01 14:41:18 compute-0 virtqemud[192597]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 01 14:41:21 compute-0 nova_compute[192698]: 2025-10-01 14:41:21.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:21 compute-0 systemd[1]: Starting Time & Date Service...
Oct 01 14:41:21 compute-0 systemd[1]: Started Time & Date Service.
Oct 01 14:41:22 compute-0 nova_compute[192698]: 2025-10-01 14:41:22.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:26 compute-0 nova_compute[192698]: 2025-10-01 14:41:26.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:26 compute-0 podman[234248]: 2025-10-01 14:41:26.367884728 +0000 UTC m=+0.108373458 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, architecture=x86_64, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 01 14:41:27 compute-0 nova_compute[192698]: 2025-10-01 14:41:27.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:28 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct 01 14:41:29 compute-0 podman[203144]: time="2025-10-01T14:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:41:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:41:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3028 "" "Go-http-client/1.1"
Oct 01 14:41:31 compute-0 nova_compute[192698]: 2025-10-01 14:41:31.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:31 compute-0 openstack_network_exporter[205307]: ERROR   14:41:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:41:31 compute-0 openstack_network_exporter[205307]: ERROR   14:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:41:31 compute-0 openstack_network_exporter[205307]: ERROR   14:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:41:31 compute-0 openstack_network_exporter[205307]: ERROR   14:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:41:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:41:31 compute-0 openstack_network_exporter[205307]: ERROR   14:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:41:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:41:32 compute-0 nova_compute[192698]: 2025-10-01 14:41:32.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:33 compute-0 podman[234272]: 2025-10-01 14:41:33.166508842 +0000 UTC m=+0.081640619 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid)
Oct 01 14:41:33 compute-0 podman[234273]: 2025-10-01 14:41:33.176155991 +0000 UTC m=+0.081911506 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Oct 01 14:41:36 compute-0 nova_compute[192698]: 2025-10-01 14:41:36.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:37 compute-0 nova_compute[192698]: 2025-10-01 14:41:37.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:38 compute-0 podman[234310]: 2025-10-01 14:41:38.151407487 +0000 UTC m=+0.064663822 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:41:38 compute-0 nova_compute[192698]: 2025-10-01 14:41:38.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:41:39 compute-0 nova_compute[192698]: 2025-10-01 14:41:39.478 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:41:39 compute-0 nova_compute[192698]: 2025-10-01 14:41:39.478 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:41:39 compute-0 nova_compute[192698]: 2025-10-01 14:41:39.478 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:41:39 compute-0 nova_compute[192698]: 2025-10-01 14:41:39.478 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:41:39 compute-0 nova_compute[192698]: 2025-10-01 14:41:39.651 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:41:39 compute-0 nova_compute[192698]: 2025-10-01 14:41:39.652 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:41:39 compute-0 nova_compute[192698]: 2025-10-01 14:41:39.701 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:41:39 compute-0 nova_compute[192698]: 2025-10-01 14:41:39.702 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5591MB free_disk=73.0272331237793GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:41:39 compute-0 nova_compute[192698]: 2025-10-01 14:41:39.702 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:41:39 compute-0 nova_compute[192698]: 2025-10-01 14:41:39.702 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:41:40 compute-0 sudo[230977]: pam_unix(sudo:session): session closed for user root
Oct 01 14:41:40 compute-0 sshd-session[230976]: Received disconnect from 192.168.122.10 port 42038:11: disconnected by user
Oct 01 14:41:40 compute-0 sshd-session[230976]: Disconnected from user zuul 192.168.122.10 port 42038
Oct 01 14:41:40 compute-0 sshd-session[230973]: pam_unix(sshd:session): session closed for user zuul
Oct 01 14:41:40 compute-0 systemd-logind[791]: Session 35 logged out. Waiting for processes to exit.
Oct 01 14:41:40 compute-0 systemd[1]: session-35.scope: Deactivated successfully.
Oct 01 14:41:40 compute-0 systemd[1]: session-35.scope: Consumed 1min 20.088s CPU time, 528.8M memory peak, read 123.9M from disk, written 25.2M to disk.
Oct 01 14:41:40 compute-0 systemd-logind[791]: Removed session 35.
Oct 01 14:41:40 compute-0 sshd-session[234336]: Accepted publickey for zuul from 192.168.122.10 port 55120 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 14:41:40 compute-0 systemd-logind[791]: New session 36 of user zuul.
Oct 01 14:41:40 compute-0 systemd[1]: Started Session 36 of User zuul.
Oct 01 14:41:40 compute-0 sshd-session[234336]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 14:41:40 compute-0 nova_compute[192698]: 2025-10-01 14:41:40.757 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:41:40 compute-0 nova_compute[192698]: 2025-10-01 14:41:40.759 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:41:39 up  1:41,  0 user,  load average: 1.12, 0.46, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:41:40 compute-0 nova_compute[192698]: 2025-10-01 14:41:40.787 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:41:40 compute-0 sudo[234340]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-10-01-ozzfztt.tar.xz
Oct 01 14:41:40 compute-0 sudo[234340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 14:41:40 compute-0 sudo[234340]: pam_unix(sudo:session): session closed for user root
Oct 01 14:41:40 compute-0 sshd-session[234339]: Received disconnect from 192.168.122.10 port 55120:11: disconnected by user
Oct 01 14:41:40 compute-0 sshd-session[234339]: Disconnected from user zuul 192.168.122.10 port 55120
Oct 01 14:41:40 compute-0 sshd-session[234336]: pam_unix(sshd:session): session closed for user zuul
Oct 01 14:41:40 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Oct 01 14:41:40 compute-0 systemd-logind[791]: Session 36 logged out. Waiting for processes to exit.
Oct 01 14:41:40 compute-0 systemd-logind[791]: Removed session 36.
Oct 01 14:41:41 compute-0 sshd-session[234365]: Accepted publickey for zuul from 192.168.122.10 port 55134 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 14:41:41 compute-0 nova_compute[192698]: 2025-10-01 14:41:41.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:41 compute-0 systemd-logind[791]: New session 37 of user zuul.
Oct 01 14:41:41 compute-0 systemd[1]: Started Session 37 of User zuul.
Oct 01 14:41:41 compute-0 sshd-session[234365]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 14:41:41 compute-0 sudo[234369]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Oct 01 14:41:41 compute-0 sudo[234369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 14:41:41 compute-0 sudo[234369]: pam_unix(sudo:session): session closed for user root
Oct 01 14:41:41 compute-0 sshd-session[234368]: Received disconnect from 192.168.122.10 port 55134:11: disconnected by user
Oct 01 14:41:41 compute-0 sshd-session[234368]: Disconnected from user zuul 192.168.122.10 port 55134
Oct 01 14:41:41 compute-0 sshd-session[234365]: pam_unix(sshd:session): session closed for user zuul
Oct 01 14:41:41 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Oct 01 14:41:41 compute-0 systemd-logind[791]: Session 37 logged out. Waiting for processes to exit.
Oct 01 14:41:41 compute-0 systemd-logind[791]: Removed session 37.
Oct 01 14:41:41 compute-0 nova_compute[192698]: 2025-10-01 14:41:41.304 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:41:41 compute-0 nova_compute[192698]: 2025-10-01 14:41:41.820 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:41:41 compute-0 nova_compute[192698]: 2025-10-01 14:41:41.821 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.119s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:41:42 compute-0 nova_compute[192698]: 2025-10-01 14:41:42.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:44 compute-0 nova_compute[192698]: 2025-10-01 14:41:44.822 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:41:44 compute-0 nova_compute[192698]: 2025-10-01 14:41:44.823 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:41:44 compute-0 nova_compute[192698]: 2025-10-01 14:41:44.823 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:41:44 compute-0 nova_compute[192698]: 2025-10-01 14:41:44.824 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:41:46 compute-0 nova_compute[192698]: 2025-10-01 14:41:46.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:46 compute-0 nova_compute[192698]: 2025-10-01 14:41:46.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:41:46 compute-0 nova_compute[192698]: 2025-10-01 14:41:46.916 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:41:47 compute-0 nova_compute[192698]: 2025-10-01 14:41:47.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:49 compute-0 podman[234394]: 2025-10-01 14:41:49.219362805 +0000 UTC m=+0.116517827 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct 01 14:41:49 compute-0 podman[234395]: 2025-10-01 14:41:49.242192369 +0000 UTC m=+0.140207874 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct 01 14:41:49 compute-0 nova_compute[192698]: 2025-10-01 14:41:49.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:41:50 compute-0 nova_compute[192698]: 2025-10-01 14:41:50.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:41:50 compute-0 nova_compute[192698]: 2025-10-01 14:41:50.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:41:51 compute-0 nova_compute[192698]: 2025-10-01 14:41:51.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:51 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct 01 14:41:51 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct 01 14:41:52 compute-0 nova_compute[192698]: 2025-10-01 14:41:52.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:56 compute-0 nova_compute[192698]: 2025-10-01 14:41:56.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:57 compute-0 podman[234443]: 2025-10-01 14:41:57.19662589 +0000 UTC m=+0.104693205 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Oct 01 14:41:57 compute-0 nova_compute[192698]: 2025-10-01 14:41:57.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:41:59 compute-0 podman[203144]: time="2025-10-01T14:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:41:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:41:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3035 "" "Go-http-client/1.1"
Oct 01 14:42:01 compute-0 nova_compute[192698]: 2025-10-01 14:42:01.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:01 compute-0 openstack_network_exporter[205307]: ERROR   14:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:42:01 compute-0 openstack_network_exporter[205307]: ERROR   14:42:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:42:01 compute-0 openstack_network_exporter[205307]: ERROR   14:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:42:01 compute-0 openstack_network_exporter[205307]: ERROR   14:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:42:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:42:01 compute-0 openstack_network_exporter[205307]: ERROR   14:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:42:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:42:02 compute-0 nova_compute[192698]: 2025-10-01 14:42:02.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:04 compute-0 podman[234466]: 2025-10-01 14:42:04.18034198 +0000 UTC m=+0.078614657 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct 01 14:42:04 compute-0 podman[234467]: 2025-10-01 14:42:04.19303377 +0000 UTC m=+0.091586894 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct 01 14:42:06 compute-0 nova_compute[192698]: 2025-10-01 14:42:06.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:07 compute-0 nova_compute[192698]: 2025-10-01 14:42:07.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:09 compute-0 podman[234505]: 2025-10-01 14:42:09.198384093 +0000 UTC m=+0.091239175 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:42:11 compute-0 nova_compute[192698]: 2025-10-01 14:42:11.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:12 compute-0 nova_compute[192698]: 2025-10-01 14:42:12.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:42:14.321 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:42:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:42:14.322 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:42:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:42:14.322 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:42:16 compute-0 nova_compute[192698]: 2025-10-01 14:42:16.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:17 compute-0 nova_compute[192698]: 2025-10-01 14:42:17.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:20 compute-0 podman[234530]: 2025-10-01 14:42:20.199475774 +0000 UTC m=+0.094340758 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 01 14:42:20 compute-0 podman[234531]: 2025-10-01 14:42:20.225946313 +0000 UTC m=+0.104781458 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250930)
Oct 01 14:42:21 compute-0 nova_compute[192698]: 2025-10-01 14:42:21.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:22 compute-0 nova_compute[192698]: 2025-10-01 14:42:22.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:26 compute-0 nova_compute[192698]: 2025-10-01 14:42:26.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:27 compute-0 nova_compute[192698]: 2025-10-01 14:42:27.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:28 compute-0 podman[234573]: 2025-10-01 14:42:28.194592302 +0000 UTC m=+0.098880789 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Oct 01 14:42:29 compute-0 podman[203144]: time="2025-10-01T14:42:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:42:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:42:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:42:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:42:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3028 "" "Go-http-client/1.1"
Oct 01 14:42:31 compute-0 nova_compute[192698]: 2025-10-01 14:42:31.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:31 compute-0 openstack_network_exporter[205307]: ERROR   14:42:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:42:31 compute-0 openstack_network_exporter[205307]: ERROR   14:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:42:31 compute-0 openstack_network_exporter[205307]: ERROR   14:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:42:31 compute-0 openstack_network_exporter[205307]: ERROR   14:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:42:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:42:31 compute-0 openstack_network_exporter[205307]: ERROR   14:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:42:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:42:32 compute-0 nova_compute[192698]: 2025-10-01 14:42:32.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:35 compute-0 podman[234595]: 2025-10-01 14:42:35.177635305 +0000 UTC m=+0.085191203 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct 01 14:42:35 compute-0 podman[234596]: 2025-10-01 14:42:35.206075837 +0000 UTC m=+0.098305444 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 01 14:42:36 compute-0 nova_compute[192698]: 2025-10-01 14:42:36.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:37 compute-0 nova_compute[192698]: 2025-10-01 14:42:37.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:40 compute-0 podman[234635]: 2025-10-01 14:42:40.204701548 +0000 UTC m=+0.094336678 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:42:40 compute-0 nova_compute[192698]: 2025-10-01 14:42:40.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:42:41 compute-0 nova_compute[192698]: 2025-10-01 14:42:41.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:41 compute-0 nova_compute[192698]: 2025-10-01 14:42:41.445 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:42:41 compute-0 nova_compute[192698]: 2025-10-01 14:42:41.446 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:42:41 compute-0 nova_compute[192698]: 2025-10-01 14:42:41.446 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:42:41 compute-0 nova_compute[192698]: 2025-10-01 14:42:41.446 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:42:41 compute-0 nova_compute[192698]: 2025-10-01 14:42:41.632 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:42:41 compute-0 nova_compute[192698]: 2025-10-01 14:42:41.633 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:42:41 compute-0 nova_compute[192698]: 2025-10-01 14:42:41.663 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:42:41 compute-0 nova_compute[192698]: 2025-10-01 14:42:41.664 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5786MB free_disk=73.2930679321289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:42:41 compute-0 nova_compute[192698]: 2025-10-01 14:42:41.665 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:42:41 compute-0 nova_compute[192698]: 2025-10-01 14:42:41.665 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:42:42 compute-0 nova_compute[192698]: 2025-10-01 14:42:42.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:42 compute-0 nova_compute[192698]: 2025-10-01 14:42:42.731 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:42:42 compute-0 nova_compute[192698]: 2025-10-01 14:42:42.731 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:42:41 up  1:42,  0 user,  load average: 0.55, 0.45, 0.31\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:42:42 compute-0 nova_compute[192698]: 2025-10-01 14:42:42.752 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:42:43 compute-0 nova_compute[192698]: 2025-10-01 14:42:43.259 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:42:43 compute-0 nova_compute[192698]: 2025-10-01 14:42:43.772 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:42:43 compute-0 nova_compute[192698]: 2025-10-01 14:42:43.772 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:42:44 compute-0 nova_compute[192698]: 2025-10-01 14:42:44.773 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:42:44 compute-0 nova_compute[192698]: 2025-10-01 14:42:44.773 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:42:44 compute-0 nova_compute[192698]: 2025-10-01 14:42:44.774 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:42:44 compute-0 nova_compute[192698]: 2025-10-01 14:42:44.774 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:42:46 compute-0 nova_compute[192698]: 2025-10-01 14:42:46.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:47 compute-0 nova_compute[192698]: 2025-10-01 14:42:47.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:47 compute-0 nova_compute[192698]: 2025-10-01 14:42:47.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:42:49 compute-0 nova_compute[192698]: 2025-10-01 14:42:49.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:42:51 compute-0 nova_compute[192698]: 2025-10-01 14:42:51.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:51 compute-0 podman[234661]: 2025-10-01 14:42:51.185817254 +0000 UTC m=+0.089001535 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Oct 01 14:42:51 compute-0 podman[234662]: 2025-10-01 14:42:51.237171129 +0000 UTC m=+0.136421535 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:42:51 compute-0 nova_compute[192698]: 2025-10-01 14:42:51.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:42:51 compute-0 nova_compute[192698]: 2025-10-01 14:42:51.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:42:52 compute-0 nova_compute[192698]: 2025-10-01 14:42:52.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:56 compute-0 nova_compute[192698]: 2025-10-01 14:42:56.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:57 compute-0 nova_compute[192698]: 2025-10-01 14:42:57.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:42:59 compute-0 podman[234706]: 2025-10-01 14:42:59.155128062 +0000 UTC m=+0.071370482 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm)
Oct 01 14:42:59 compute-0 podman[203144]: time="2025-10-01T14:42:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:42:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:42:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:42:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:42:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3029 "" "Go-http-client/1.1"
Oct 01 14:43:01 compute-0 nova_compute[192698]: 2025-10-01 14:43:01.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:01 compute-0 openstack_network_exporter[205307]: ERROR   14:43:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:43:01 compute-0 openstack_network_exporter[205307]: ERROR   14:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:43:01 compute-0 openstack_network_exporter[205307]: ERROR   14:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:43:01 compute-0 openstack_network_exporter[205307]: ERROR   14:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:43:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:43:01 compute-0 openstack_network_exporter[205307]: ERROR   14:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:43:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:43:02 compute-0 nova_compute[192698]: 2025-10-01 14:43:02.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:06 compute-0 nova_compute[192698]: 2025-10-01 14:43:06.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:06 compute-0 podman[234730]: 2025-10-01 14:43:06.196571429 +0000 UTC m=+0.098726515 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct 01 14:43:06 compute-0 podman[234729]: 2025-10-01 14:43:06.200513024 +0000 UTC m=+0.099102385 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.build-date=20250930)
Oct 01 14:43:07 compute-0 nova_compute[192698]: 2025-10-01 14:43:07.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:11 compute-0 nova_compute[192698]: 2025-10-01 14:43:11.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:11 compute-0 podman[234772]: 2025-10-01 14:43:11.179514661 +0000 UTC m=+0.070692824 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct 01 14:43:12 compute-0 nova_compute[192698]: 2025-10-01 14:43:12.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:43:14.323 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:43:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:43:14.324 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:43:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:43:14.324 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:43:16 compute-0 nova_compute[192698]: 2025-10-01 14:43:16.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:17 compute-0 nova_compute[192698]: 2025-10-01 14:43:17.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:21 compute-0 nova_compute[192698]: 2025-10-01 14:43:21.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:22 compute-0 podman[234798]: 2025-10-01 14:43:22.19546668 +0000 UTC m=+0.093954148 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct 01 14:43:22 compute-0 podman[234799]: 2025-10-01 14:43:22.23466589 +0000 UTC m=+0.112862714 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:43:22 compute-0 nova_compute[192698]: 2025-10-01 14:43:22.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:26 compute-0 nova_compute[192698]: 2025-10-01 14:43:26.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:27 compute-0 nova_compute[192698]: 2025-10-01 14:43:27.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:29 compute-0 podman[203144]: time="2025-10-01T14:43:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:43:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:43:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:43:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:43:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3031 "" "Go-http-client/1.1"
Oct 01 14:43:30 compute-0 podman[234842]: 2025-10-01 14:43:30.151167413 +0000 UTC m=+0.062018002 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal)
Oct 01 14:43:31 compute-0 nova_compute[192698]: 2025-10-01 14:43:31.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:31 compute-0 openstack_network_exporter[205307]: ERROR   14:43:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:43:31 compute-0 openstack_network_exporter[205307]: ERROR   14:43:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:43:31 compute-0 openstack_network_exporter[205307]: ERROR   14:43:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:43:31 compute-0 openstack_network_exporter[205307]: ERROR   14:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:43:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:43:31 compute-0 openstack_network_exporter[205307]: ERROR   14:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:43:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:43:32 compute-0 nova_compute[192698]: 2025-10-01 14:43:32.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:36 compute-0 nova_compute[192698]: 2025-10-01 14:43:36.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:37 compute-0 podman[234863]: 2025-10-01 14:43:37.147764898 +0000 UTC m=+0.064611441 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Oct 01 14:43:37 compute-0 podman[234864]: 2025-10-01 14:43:37.176118598 +0000 UTC m=+0.076908441 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 01 14:43:37 compute-0 nova_compute[192698]: 2025-10-01 14:43:37.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:40 compute-0 nova_compute[192698]: 2025-10-01 14:43:40.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:43:41 compute-0 nova_compute[192698]: 2025-10-01 14:43:41.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:41 compute-0 nova_compute[192698]: 2025-10-01 14:43:41.444 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:43:41 compute-0 nova_compute[192698]: 2025-10-01 14:43:41.445 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:43:41 compute-0 nova_compute[192698]: 2025-10-01 14:43:41.445 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:43:41 compute-0 nova_compute[192698]: 2025-10-01 14:43:41.446 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:43:41 compute-0 nova_compute[192698]: 2025-10-01 14:43:41.656 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:43:41 compute-0 nova_compute[192698]: 2025-10-01 14:43:41.658 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:43:41 compute-0 nova_compute[192698]: 2025-10-01 14:43:41.686 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:43:41 compute-0 nova_compute[192698]: 2025-10-01 14:43:41.688 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5834MB free_disk=73.2930679321289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:43:41 compute-0 nova_compute[192698]: 2025-10-01 14:43:41.688 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:43:41 compute-0 nova_compute[192698]: 2025-10-01 14:43:41.689 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:43:42 compute-0 podman[234906]: 2025-10-01 14:43:42.178888442 +0000 UTC m=+0.079266474 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:43:42 compute-0 nova_compute[192698]: 2025-10-01 14:43:42.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:42 compute-0 nova_compute[192698]: 2025-10-01 14:43:42.819 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:43:42 compute-0 nova_compute[192698]: 2025-10-01 14:43:42.820 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:43:41 up  1:43,  0 user,  load average: 0.20, 0.36, 0.28\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:43:42 compute-0 nova_compute[192698]: 2025-10-01 14:43:42.840 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing inventories for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0 _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Oct 01 14:43:42 compute-0 nova_compute[192698]: 2025-10-01 14:43:42.855 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating ProviderTree inventory for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Oct 01 14:43:42 compute-0 nova_compute[192698]: 2025-10-01 14:43:42.855 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Updating inventory in ProviderTree for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Oct 01 14:43:42 compute-0 nova_compute[192698]: 2025-10-01 14:43:42.867 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing aggregate associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Oct 01 14:43:42 compute-0 nova_compute[192698]: 2025-10-01 14:43:42.885 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Refreshing trait associations for resource provider ee1e54f5-453b-4949-a499-9a192f03b8f0, traits: COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_TPM_TIS,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,COMPUTE_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_SOUND_MODEL_AC97,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_SOUND_MODEL_ES1370,HW_ARCH_X86_64,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_SECURITY_TPM_CRB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_SB16,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SOUND_MODEL_USB,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,COMPUTE_NODE,HW_CPU_X86_SSSE3,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,COMPUTE_SOUND_MODEL_PCSPK,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_ABM,COMPUTE_SOUND_MODEL_ICH6,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_IGB,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE4A,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Oct 01 14:43:42 compute-0 nova_compute[192698]: 2025-10-01 14:43:42.908 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:43:43 compute-0 nova_compute[192698]: 2025-10-01 14:43:43.416 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:43:43 compute-0 nova_compute[192698]: 2025-10-01 14:43:43.929 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:43:43 compute-0 nova_compute[192698]: 2025-10-01 14:43:43.930 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.241s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:43:46 compute-0 nova_compute[192698]: 2025-10-01 14:43:46.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:46 compute-0 nova_compute[192698]: 2025-10-01 14:43:46.930 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:43:46 compute-0 nova_compute[192698]: 2025-10-01 14:43:46.931 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:43:46 compute-0 nova_compute[192698]: 2025-10-01 14:43:46.931 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:43:46 compute-0 nova_compute[192698]: 2025-10-01 14:43:46.932 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:43:47 compute-0 nova_compute[192698]: 2025-10-01 14:43:47.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:47 compute-0 nova_compute[192698]: 2025-10-01 14:43:47.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:43:49 compute-0 nova_compute[192698]: 2025-10-01 14:43:49.913 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:43:50 compute-0 nova_compute[192698]: 2025-10-01 14:43:50.425 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:43:51 compute-0 nova_compute[192698]: 2025-10-01 14:43:51.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:52 compute-0 nova_compute[192698]: 2025-10-01 14:43:52.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:53 compute-0 podman[234930]: 2025-10-01 14:43:53.204544873 +0000 UTC m=+0.106489223 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct 01 14:43:53 compute-0 podman[234931]: 2025-10-01 14:43:53.221830566 +0000 UTC m=+0.122159033 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:43:53 compute-0 nova_compute[192698]: 2025-10-01 14:43:53.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:43:53 compute-0 nova_compute[192698]: 2025-10-01 14:43:53.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:43:56 compute-0 nova_compute[192698]: 2025-10-01 14:43:56.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:56 compute-0 nova_compute[192698]: 2025-10-01 14:43:56.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:43:56 compute-0 nova_compute[192698]: 2025-10-01 14:43:56.927 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Oct 01 14:43:57 compute-0 nova_compute[192698]: 2025-10-01 14:43:57.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:43:59 compute-0 podman[203144]: time="2025-10-01T14:43:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:43:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:43:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:43:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:43:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3031 "" "Go-http-client/1.1"
Oct 01 14:44:01 compute-0 podman[234974]: 2025-10-01 14:44:01.151253366 +0000 UTC m=+0.067798757 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct 01 14:44:01 compute-0 nova_compute[192698]: 2025-10-01 14:44:01.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:01 compute-0 openstack_network_exporter[205307]: ERROR   14:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:44:01 compute-0 openstack_network_exporter[205307]: ERROR   14:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:44:01 compute-0 openstack_network_exporter[205307]: ERROR   14:44:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:44:01 compute-0 openstack_network_exporter[205307]: ERROR   14:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:44:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:44:01 compute-0 openstack_network_exporter[205307]: ERROR   14:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:44:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:44:02 compute-0 nova_compute[192698]: 2025-10-01 14:44:02.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:03 compute-0 unix_chkpwd[234996]: password check failed for user (root)
Oct 01 14:44:03 compute-0 sshd-session[234994]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Oct 01 14:44:05 compute-0 sshd-session[234994]: Failed password for root from 193.46.255.103 port 54192 ssh2
Oct 01 14:44:05 compute-0 nova_compute[192698]: 2025-10-01 14:44:05.432 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:44:05 compute-0 nova_compute[192698]: 2025-10-01 14:44:05.432 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Oct 01 14:44:05 compute-0 nova_compute[192698]: 2025-10-01 14:44:05.941 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Oct 01 14:44:06 compute-0 nova_compute[192698]: 2025-10-01 14:44:06.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:06 compute-0 unix_chkpwd[234997]: password check failed for user (root)
Oct 01 14:44:07 compute-0 nova_compute[192698]: 2025-10-01 14:44:07.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:07 compute-0 nova_compute[192698]: 2025-10-01 14:44:07.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:44:08 compute-0 podman[234998]: 2025-10-01 14:44:08.191952252 +0000 UTC m=+0.094882382 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct 01 14:44:08 compute-0 podman[234999]: 2025-10-01 14:44:08.193766761 +0000 UTC m=+0.092982022 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest)
Oct 01 14:44:08 compute-0 sshd-session[234994]: Failed password for root from 193.46.255.103 port 54192 ssh2
Oct 01 14:44:09 compute-0 unix_chkpwd[235040]: password check failed for user (root)
Oct 01 14:44:11 compute-0 nova_compute[192698]: 2025-10-01 14:44:11.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:11 compute-0 sshd-session[234994]: Failed password for root from 193.46.255.103 port 54192 ssh2
Oct 01 14:44:11 compute-0 sshd-session[234994]: Received disconnect from 193.46.255.103 port 54192:11:  [preauth]
Oct 01 14:44:11 compute-0 sshd-session[234994]: Disconnected from authenticating user root 193.46.255.103 port 54192 [preauth]
Oct 01 14:44:11 compute-0 sshd-session[234994]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Oct 01 14:44:12 compute-0 nova_compute[192698]: 2025-10-01 14:44:12.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:12 compute-0 unix_chkpwd[235043]: password check failed for user (root)
Oct 01 14:44:12 compute-0 sshd-session[235041]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Oct 01 14:44:13 compute-0 podman[235044]: 2025-10-01 14:44:13.158688592 +0000 UTC m=+0.071873056 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:44:14.325 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:44:14.326 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:44:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:44:14.326 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:44:14 compute-0 sshd-session[235041]: Failed password for root from 193.46.255.103 port 54208 ssh2
Oct 01 14:44:15 compute-0 unix_chkpwd[235070]: password check failed for user (root)
Oct 01 14:44:16 compute-0 nova_compute[192698]: 2025-10-01 14:44:16.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:17 compute-0 sshd-session[235041]: Failed password for root from 193.46.255.103 port 54208 ssh2
Oct 01 14:44:17 compute-0 nova_compute[192698]: 2025-10-01 14:44:17.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:18 compute-0 unix_chkpwd[235071]: password check failed for user (root)
Oct 01 14:44:20 compute-0 sshd-session[235041]: Failed password for root from 193.46.255.103 port 54208 ssh2
Oct 01 14:44:21 compute-0 nova_compute[192698]: 2025-10-01 14:44:21.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:21 compute-0 sshd-session[235041]: Received disconnect from 193.46.255.103 port 54208:11:  [preauth]
Oct 01 14:44:21 compute-0 sshd-session[235041]: Disconnected from authenticating user root 193.46.255.103 port 54208 [preauth]
Oct 01 14:44:21 compute-0 sshd-session[235041]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Oct 01 14:44:22 compute-0 unix_chkpwd[235074]: password check failed for user (root)
Oct 01 14:44:22 compute-0 sshd-session[235072]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Oct 01 14:44:22 compute-0 nova_compute[192698]: 2025-10-01 14:44:22.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:24 compute-0 podman[235075]: 2025-10-01 14:44:24.158179491 +0000 UTC m=+0.067596331 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 01 14:44:24 compute-0 sshd-session[235072]: Failed password for root from 193.46.255.103 port 37756 ssh2
Oct 01 14:44:24 compute-0 podman[235076]: 2025-10-01 14:44:24.219786751 +0000 UTC m=+0.121001122 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller)
Oct 01 14:44:25 compute-0 unix_chkpwd[235121]: password check failed for user (root)
Oct 01 14:44:26 compute-0 nova_compute[192698]: 2025-10-01 14:44:26.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:27 compute-0 sshd-session[235072]: Failed password for root from 193.46.255.103 port 37756 ssh2
Oct 01 14:44:27 compute-0 nova_compute[192698]: 2025-10-01 14:44:27.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:28 compute-0 unix_chkpwd[235122]: password check failed for user (root)
Oct 01 14:44:29 compute-0 podman[203144]: time="2025-10-01T14:44:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:44:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:44:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:44:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:44:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3029 "" "Go-http-client/1.1"
Oct 01 14:44:30 compute-0 sshd-session[235072]: Failed password for root from 193.46.255.103 port 37756 ssh2
Oct 01 14:44:30 compute-0 sshd-session[235072]: Received disconnect from 193.46.255.103 port 37756:11:  [preauth]
Oct 01 14:44:30 compute-0 sshd-session[235072]: Disconnected from authenticating user root 193.46.255.103 port 37756 [preauth]
Oct 01 14:44:30 compute-0 sshd-session[235072]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Oct 01 14:44:31 compute-0 nova_compute[192698]: 2025-10-01 14:44:31.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:31 compute-0 openstack_network_exporter[205307]: ERROR   14:44:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:44:31 compute-0 openstack_network_exporter[205307]: ERROR   14:44:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:44:31 compute-0 openstack_network_exporter[205307]: ERROR   14:44:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:44:31 compute-0 openstack_network_exporter[205307]: ERROR   14:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:44:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:44:31 compute-0 openstack_network_exporter[205307]: ERROR   14:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:44:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:44:32 compute-0 podman[235123]: 2025-10-01 14:44:32.170051098 +0000 UTC m=+0.089005155 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=edpm, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc.)
Oct 01 14:44:32 compute-0 nova_compute[192698]: 2025-10-01 14:44:32.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:36 compute-0 nova_compute[192698]: 2025-10-01 14:44:36.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:37 compute-0 nova_compute[192698]: 2025-10-01 14:44:37.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:39 compute-0 podman[235144]: 2025-10-01 14:44:39.179795545 +0000 UTC m=+0.082884571 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 01 14:44:39 compute-0 podman[235145]: 2025-10-01 14:44:39.210552039 +0000 UTC m=+0.102293031 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd)
Oct 01 14:44:41 compute-0 nova_compute[192698]: 2025-10-01 14:44:41.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:42 compute-0 nova_compute[192698]: 2025-10-01 14:44:42.432 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:44:42 compute-0 nova_compute[192698]: 2025-10-01 14:44:42.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:42 compute-0 nova_compute[192698]: 2025-10-01 14:44:42.962 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:44:42 compute-0 nova_compute[192698]: 2025-10-01 14:44:42.963 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:44:42 compute-0 nova_compute[192698]: 2025-10-01 14:44:42.964 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:44:42 compute-0 nova_compute[192698]: 2025-10-01 14:44:42.964 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:44:43 compute-0 nova_compute[192698]: 2025-10-01 14:44:43.152 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:44:43 compute-0 nova_compute[192698]: 2025-10-01 14:44:43.154 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:44:43 compute-0 nova_compute[192698]: 2025-10-01 14:44:43.193 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:44:43 compute-0 nova_compute[192698]: 2025-10-01 14:44:43.193 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5822MB free_disk=73.29328918457031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:44:43 compute-0 nova_compute[192698]: 2025-10-01 14:44:43.194 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:44:43 compute-0 nova_compute[192698]: 2025-10-01 14:44:43.194 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:44:44 compute-0 podman[235186]: 2025-10-01 14:44:44.153574922 +0000 UTC m=+0.069558894 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct 01 14:44:44 compute-0 nova_compute[192698]: 2025-10-01 14:44:44.372 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:44:44 compute-0 nova_compute[192698]: 2025-10-01 14:44:44.372 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:44:43 up  1:44,  0 user,  load average: 0.07, 0.30, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:44:44 compute-0 nova_compute[192698]: 2025-10-01 14:44:44.394 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:44:44 compute-0 nova_compute[192698]: 2025-10-01 14:44:44.906 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:44:45 compute-0 nova_compute[192698]: 2025-10-01 14:44:45.420 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:44:45 compute-0 nova_compute[192698]: 2025-10-01 14:44:45.421 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.226s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:44:46 compute-0 nova_compute[192698]: 2025-10-01 14:44:46.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:46 compute-0 nova_compute[192698]: 2025-10-01 14:44:46.922 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:44:46 compute-0 nova_compute[192698]: 2025-10-01 14:44:46.923 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:44:46 compute-0 nova_compute[192698]: 2025-10-01 14:44:46.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:44:46 compute-0 nova_compute[192698]: 2025-10-01 14:44:46.924 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:44:47 compute-0 nova_compute[192698]: 2025-10-01 14:44:47.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:47 compute-0 nova_compute[192698]: 2025-10-01 14:44:47.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:44:49 compute-0 nova_compute[192698]: 2025-10-01 14:44:49.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:44:51 compute-0 nova_compute[192698]: 2025-10-01 14:44:51.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:52 compute-0 nova_compute[192698]: 2025-10-01 14:44:52.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:54 compute-0 nova_compute[192698]: 2025-10-01 14:44:54.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:44:54 compute-0 nova_compute[192698]: 2025-10-01 14:44:54.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:44:55 compute-0 podman[235210]: 2025-10-01 14:44:55.16794897 +0000 UTC m=+0.079435769 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct 01 14:44:55 compute-0 podman[235211]: 2025-10-01 14:44:55.221107014 +0000 UTC m=+0.131603326 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_controller, container_name=ovn_controller)
Oct 01 14:44:56 compute-0 nova_compute[192698]: 2025-10-01 14:44:56.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:57 compute-0 nova_compute[192698]: 2025-10-01 14:44:57.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:44:59 compute-0 podman[203144]: time="2025-10-01T14:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:44:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:44:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3028 "" "Go-http-client/1.1"
Oct 01 14:45:01 compute-0 nova_compute[192698]: 2025-10-01 14:45:01.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:01 compute-0 openstack_network_exporter[205307]: ERROR   14:45:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:45:01 compute-0 openstack_network_exporter[205307]: ERROR   14:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:45:01 compute-0 openstack_network_exporter[205307]: ERROR   14:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:45:01 compute-0 openstack_network_exporter[205307]: ERROR   14:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:45:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:45:01 compute-0 openstack_network_exporter[205307]: ERROR   14:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:45:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:45:02 compute-0 nova_compute[192698]: 2025-10-01 14:45:02.105 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:45:02 compute-0 nova_compute[192698]: 2025-10-01 14:45:02.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:03 compute-0 podman[235253]: 2025-10-01 14:45:03.14852941 +0000 UTC m=+0.062462964 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, version=9.6, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Oct 01 14:45:06 compute-0 nova_compute[192698]: 2025-10-01 14:45:06.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:07 compute-0 nova_compute[192698]: 2025-10-01 14:45:07.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:10 compute-0 podman[235276]: 2025-10-01 14:45:10.188406824 +0000 UTC m=+0.097953544 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct 01 14:45:10 compute-0 podman[235277]: 2025-10-01 14:45:10.205747108 +0000 UTC m=+0.105996919 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4)
Oct 01 14:45:11 compute-0 nova_compute[192698]: 2025-10-01 14:45:11.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:13 compute-0 nova_compute[192698]: 2025-10-01 14:45:13.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:45:14.327 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:45:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:45:14.328 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:45:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:45:14.328 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:45:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:45:15.002 103791 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'e2:3f:3c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '4a:1d:a6:67:ed:e6'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Oct 01 14:45:15 compute-0 nova_compute[192698]: 2025-10-01 14:45:15.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:15 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:45:15.004 103791 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Oct 01 14:45:15 compute-0 podman[235316]: 2025-10-01 14:45:15.165695525 +0000 UTC m=+0.072528294 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:45:16 compute-0 nova_compute[192698]: 2025-10-01 14:45:16.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:18 compute-0 nova_compute[192698]: 2025-10-01 14:45:18.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:21 compute-0 nova_compute[192698]: 2025-10-01 14:45:21.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:23 compute-0 nova_compute[192698]: 2025-10-01 14:45:23.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:24 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:45:24.006 103791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=10cf9814-09fa-4bad-879a-270f9b64eda3, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct 01 14:45:26 compute-0 podman[235342]: 2025-10-01 14:45:26.15915507 +0000 UTC m=+0.067392356 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 14:45:26 compute-0 podman[235343]: 2025-10-01 14:45:26.216428164 +0000 UTC m=+0.125399260 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Oct 01 14:45:26 compute-0 nova_compute[192698]: 2025-10-01 14:45:26.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:28 compute-0 nova_compute[192698]: 2025-10-01 14:45:28.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:29 compute-0 podman[203144]: time="2025-10-01T14:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:45:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:45:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3033 "" "Go-http-client/1.1"
Oct 01 14:45:31 compute-0 nova_compute[192698]: 2025-10-01 14:45:31.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:31 compute-0 openstack_network_exporter[205307]: ERROR   14:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:45:31 compute-0 openstack_network_exporter[205307]: ERROR   14:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:45:31 compute-0 openstack_network_exporter[205307]: ERROR   14:45:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:45:31 compute-0 openstack_network_exporter[205307]: ERROR   14:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:45:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:45:31 compute-0 openstack_network_exporter[205307]: ERROR   14:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:45:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:45:33 compute-0 nova_compute[192698]: 2025-10-01 14:45:33.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:34 compute-0 podman[235387]: 2025-10-01 14:45:34.192145743 +0000 UTC m=+0.101442278 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct 01 14:45:36 compute-0 nova_compute[192698]: 2025-10-01 14:45:36.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:38 compute-0 nova_compute[192698]: 2025-10-01 14:45:38.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:41 compute-0 podman[235410]: 2025-10-01 14:45:41.190125195 +0000 UTC m=+0.088911762 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4)
Oct 01 14:45:41 compute-0 podman[235409]: 2025-10-01 14:45:41.221354492 +0000 UTC m=+0.122672807 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct 01 14:45:41 compute-0 nova_compute[192698]: 2025-10-01 14:45:41.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:42 compute-0 nova_compute[192698]: 2025-10-01 14:45:42.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:45:43 compute-0 nova_compute[192698]: 2025-10-01 14:45:43.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:43 compute-0 nova_compute[192698]: 2025-10-01 14:45:43.455 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:45:43 compute-0 nova_compute[192698]: 2025-10-01 14:45:43.455 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:45:43 compute-0 nova_compute[192698]: 2025-10-01 14:45:43.456 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:45:43 compute-0 nova_compute[192698]: 2025-10-01 14:45:43.456 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:45:43 compute-0 nova_compute[192698]: 2025-10-01 14:45:43.674 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:45:43 compute-0 nova_compute[192698]: 2025-10-01 14:45:43.675 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:45:43 compute-0 nova_compute[192698]: 2025-10-01 14:45:43.716 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:45:43 compute-0 nova_compute[192698]: 2025-10-01 14:45:43.717 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5824MB free_disk=73.29328918457031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:45:43 compute-0 nova_compute[192698]: 2025-10-01 14:45:43.718 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:45:43 compute-0 nova_compute[192698]: 2025-10-01 14:45:43.718 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:45:44 compute-0 nova_compute[192698]: 2025-10-01 14:45:44.807 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:45:44 compute-0 nova_compute[192698]: 2025-10-01 14:45:44.808 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:45:43 up  1:45,  0 user,  load average: 0.02, 0.24, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:45:44 compute-0 nova_compute[192698]: 2025-10-01 14:45:44.839 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:45:45 compute-0 nova_compute[192698]: 2025-10-01 14:45:45.351 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:45:45 compute-0 podman[235447]: 2025-10-01 14:45:45.3984283 +0000 UTC m=+0.069378490 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:45:45 compute-0 nova_compute[192698]: 2025-10-01 14:45:45.865 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:45:45 compute-0 nova_compute[192698]: 2025-10-01 14:45:45.866 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.147s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:45:46 compute-0 nova_compute[192698]: 2025-10-01 14:45:46.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:46 compute-0 nova_compute[192698]: 2025-10-01 14:45:46.866 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:45:46 compute-0 nova_compute[192698]: 2025-10-01 14:45:46.866 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:45:46 compute-0 nova_compute[192698]: 2025-10-01 14:45:46.867 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:45:46 compute-0 nova_compute[192698]: 2025-10-01 14:45:46.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:45:48 compute-0 nova_compute[192698]: 2025-10-01 14:45:48.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:48 compute-0 nova_compute[192698]: 2025-10-01 14:45:48.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:45:50 compute-0 nova_compute[192698]: 2025-10-01 14:45:50.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:45:51 compute-0 nova_compute[192698]: 2025-10-01 14:45:51.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:51 compute-0 nova_compute[192698]: 2025-10-01 14:45:51.424 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:45:53 compute-0 nova_compute[192698]: 2025-10-01 14:45:53.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:55 compute-0 nova_compute[192698]: 2025-10-01 14:45:55.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:45:55 compute-0 nova_compute[192698]: 2025-10-01 14:45:55.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:45:56 compute-0 nova_compute[192698]: 2025-10-01 14:45:56.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:57 compute-0 podman[235470]: 2025-10-01 14:45:57.16675984 +0000 UTC m=+0.081301369 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 01 14:45:57 compute-0 podman[235471]: 2025-10-01 14:45:57.223685024 +0000 UTC m=+0.125699167 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Oct 01 14:45:58 compute-0 nova_compute[192698]: 2025-10-01 14:45:58.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:45:59 compute-0 podman[203144]: time="2025-10-01T14:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:45:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:45:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3037 "" "Go-http-client/1.1"
Oct 01 14:46:01 compute-0 nova_compute[192698]: 2025-10-01 14:46:01.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:01 compute-0 openstack_network_exporter[205307]: ERROR   14:46:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:46:01 compute-0 openstack_network_exporter[205307]: ERROR   14:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:46:01 compute-0 openstack_network_exporter[205307]: ERROR   14:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:46:01 compute-0 openstack_network_exporter[205307]: ERROR   14:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:46:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:46:01 compute-0 openstack_network_exporter[205307]: ERROR   14:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:46:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:46:03 compute-0 nova_compute[192698]: 2025-10-01 14:46:03.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:05 compute-0 podman[235516]: 2025-10-01 14:46:05.144383341 +0000 UTC m=+0.061354435 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Oct 01 14:46:06 compute-0 sshd-session[235538]: banner exchange: Connection from 195.178.110.15 port 54050: invalid format
Oct 01 14:46:06 compute-0 sshd-session[235539]: banner exchange: Connection from 195.178.110.15 port 54062: invalid format
Oct 01 14:46:06 compute-0 nova_compute[192698]: 2025-10-01 14:46:06.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:08 compute-0 nova_compute[192698]: 2025-10-01 14:46:08.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:11 compute-0 nova_compute[192698]: 2025-10-01 14:46:11.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:12 compute-0 podman[235540]: 2025-10-01 14:46:12.176595771 +0000 UTC m=+0.086052356 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct 01 14:46:12 compute-0 podman[235541]: 2025-10-01 14:46:12.183588548 +0000 UTC m=+0.084121414 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 01 14:46:13 compute-0 nova_compute[192698]: 2025-10-01 14:46:13.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:46:14.329 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:46:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:46:14.330 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:46:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:46:14.330 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:46:16 compute-0 podman[235579]: 2025-10-01 14:46:16.140820327 +0000 UTC m=+0.060436041 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct 01 14:46:16 compute-0 nova_compute[192698]: 2025-10-01 14:46:16.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:18 compute-0 nova_compute[192698]: 2025-10-01 14:46:18.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:21 compute-0 nova_compute[192698]: 2025-10-01 14:46:21.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:23 compute-0 nova_compute[192698]: 2025-10-01 14:46:23.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:26 compute-0 nova_compute[192698]: 2025-10-01 14:46:26.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:28 compute-0 podman[235603]: 2025-10-01 14:46:28.178462291 +0000 UTC m=+0.087040023 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:46:28 compute-0 podman[235604]: 2025-10-01 14:46:28.21840602 +0000 UTC m=+0.122571454 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible)
Oct 01 14:46:28 compute-0 nova_compute[192698]: 2025-10-01 14:46:28.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:29 compute-0 podman[203144]: time="2025-10-01T14:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:46:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:46:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3034 "" "Go-http-client/1.1"
Oct 01 14:46:31 compute-0 nova_compute[192698]: 2025-10-01 14:46:31.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:31 compute-0 openstack_network_exporter[205307]: ERROR   14:46:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:46:31 compute-0 openstack_network_exporter[205307]: ERROR   14:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:46:31 compute-0 openstack_network_exporter[205307]: ERROR   14:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:46:31 compute-0 openstack_network_exporter[205307]: ERROR   14:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:46:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:46:31 compute-0 openstack_network_exporter[205307]: ERROR   14:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:46:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:46:33 compute-0 nova_compute[192698]: 2025-10-01 14:46:33.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:36 compute-0 podman[235646]: 2025-10-01 14:46:36.183790823 +0000 UTC m=+0.093204997 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, release=1755695350, vcs-type=git, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 01 14:46:36 compute-0 nova_compute[192698]: 2025-10-01 14:46:36.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:38 compute-0 nova_compute[192698]: 2025-10-01 14:46:38.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:41 compute-0 nova_compute[192698]: 2025-10-01 14:46:41.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:42 compute-0 nova_compute[192698]: 2025-10-01 14:46:42.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:46:43 compute-0 podman[235668]: 2025-10-01 14:46:43.162161503 +0000 UTC m=+0.075761770 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Oct 01 14:46:43 compute-0 podman[235669]: 2025-10-01 14:46:43.170579788 +0000 UTC m=+0.081416351 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true)
Oct 01 14:46:43 compute-0 nova_compute[192698]: 2025-10-01 14:46:43.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:43 compute-0 nova_compute[192698]: 2025-10-01 14:46:43.453 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:46:43 compute-0 nova_compute[192698]: 2025-10-01 14:46:43.453 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:46:43 compute-0 nova_compute[192698]: 2025-10-01 14:46:43.453 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:46:43 compute-0 nova_compute[192698]: 2025-10-01 14:46:43.453 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:46:43 compute-0 nova_compute[192698]: 2025-10-01 14:46:43.649 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:46:43 compute-0 nova_compute[192698]: 2025-10-01 14:46:43.651 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:46:43 compute-0 nova_compute[192698]: 2025-10-01 14:46:43.674 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.023s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:46:43 compute-0 nova_compute[192698]: 2025-10-01 14:46:43.675 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5824MB free_disk=73.29331970214844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:46:43 compute-0 nova_compute[192698]: 2025-10-01 14:46:43.676 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:46:43 compute-0 nova_compute[192698]: 2025-10-01 14:46:43.676 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:46:44 compute-0 nova_compute[192698]: 2025-10-01 14:46:44.728 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:46:44 compute-0 nova_compute[192698]: 2025-10-01 14:46:44.729 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:46:43 up  1:46,  0 user,  load average: 0.06, 0.21, 0.23\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:46:44 compute-0 nova_compute[192698]: 2025-10-01 14:46:44.746 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:46:45 compute-0 nova_compute[192698]: 2025-10-01 14:46:45.279 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:46:45 compute-0 nova_compute[192698]: 2025-10-01 14:46:45.789 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:46:45 compute-0 nova_compute[192698]: 2025-10-01 14:46:45.790 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:46:46 compute-0 nova_compute[192698]: 2025-10-01 14:46:46.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:46 compute-0 nova_compute[192698]: 2025-10-01 14:46:46.789 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:46:46 compute-0 nova_compute[192698]: 2025-10-01 14:46:46.790 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:46:46 compute-0 nova_compute[192698]: 2025-10-01 14:46:46.790 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:46:47 compute-0 podman[235708]: 2025-10-01 14:46:47.184874856 +0000 UTC m=+0.098642863 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct 01 14:46:47 compute-0 nova_compute[192698]: 2025-10-01 14:46:47.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:46:48 compute-0 nova_compute[192698]: 2025-10-01 14:46:48.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:48 compute-0 nova_compute[192698]: 2025-10-01 14:46:48.914 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:46:51 compute-0 nova_compute[192698]: 2025-10-01 14:46:51.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:51 compute-0 nova_compute[192698]: 2025-10-01 14:46:51.926 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:46:53 compute-0 nova_compute[192698]: 2025-10-01 14:46:53.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:56 compute-0 nova_compute[192698]: 2025-10-01 14:46:56.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:57 compute-0 nova_compute[192698]: 2025-10-01 14:46:57.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:46:57 compute-0 nova_compute[192698]: 2025-10-01 14:46:57.926 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:46:58 compute-0 nova_compute[192698]: 2025-10-01 14:46:58.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:46:59 compute-0 podman[235732]: 2025-10-01 14:46:59.189043004 +0000 UTC m=+0.091338657 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Oct 01 14:46:59 compute-0 podman[235733]: 2025-10-01 14:46:59.23254559 +0000 UTC m=+0.132623874 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct 01 14:46:59 compute-0 podman[203144]: time="2025-10-01T14:46:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:46:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:46:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:46:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:46:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3029 "" "Go-http-client/1.1"
Oct 01 14:47:01 compute-0 nova_compute[192698]: 2025-10-01 14:47:01.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:01 compute-0 openstack_network_exporter[205307]: ERROR   14:47:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:47:01 compute-0 openstack_network_exporter[205307]: ERROR   14:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:47:01 compute-0 openstack_network_exporter[205307]: ERROR   14:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:47:01 compute-0 openstack_network_exporter[205307]: ERROR   14:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:47:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:47:01 compute-0 openstack_network_exporter[205307]: ERROR   14:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:47:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:47:03 compute-0 nova_compute[192698]: 2025-10-01 14:47:03.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:06 compute-0 nova_compute[192698]: 2025-10-01 14:47:06.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:07 compute-0 podman[235775]: 2025-10-01 14:47:07.182983443 +0000 UTC m=+0.089902619 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, version=9.6, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 01 14:47:08 compute-0 nova_compute[192698]: 2025-10-01 14:47:08.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:11 compute-0 nova_compute[192698]: 2025-10-01 14:47:11.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:13 compute-0 nova_compute[192698]: 2025-10-01 14:47:13.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:14 compute-0 podman[235797]: 2025-10-01 14:47:14.17780437 +0000 UTC m=+0.089893029 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Oct 01 14:47:14 compute-0 podman[235796]: 2025-10-01 14:47:14.183078401 +0000 UTC m=+0.092923160 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Oct 01 14:47:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:47:14.332 103791 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:47:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:47:14.333 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:47:14 compute-0 ovn_metadata_agent[103777]: 2025-10-01 14:47:14.333 103791 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:47:16 compute-0 nova_compute[192698]: 2025-10-01 14:47:16.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:18 compute-0 podman[235837]: 2025-10-01 14:47:18.17654031 +0000 UTC m=+0.079552682 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:47:18 compute-0 nova_compute[192698]: 2025-10-01 14:47:18.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:21 compute-0 nova_compute[192698]: 2025-10-01 14:47:21.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:23 compute-0 nova_compute[192698]: 2025-10-01 14:47:23.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:26 compute-0 nova_compute[192698]: 2025-10-01 14:47:26.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:28 compute-0 nova_compute[192698]: 2025-10-01 14:47:28.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:29 compute-0 podman[203144]: time="2025-10-01T14:47:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:47:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:47:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:47:29 compute-0 podman[203144]: @ - - [01/Oct/2025:14:47:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3034 "" "Go-http-client/1.1"
Oct 01 14:47:30 compute-0 podman[235861]: 2025-10-01 14:47:30.179262848 +0000 UTC m=+0.082009667 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct 01 14:47:30 compute-0 podman[235862]: 2025-10-01 14:47:30.233586093 +0000 UTC m=+0.134346309 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct 01 14:47:31 compute-0 openstack_network_exporter[205307]: ERROR   14:47:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:47:31 compute-0 openstack_network_exporter[205307]: ERROR   14:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:47:31 compute-0 openstack_network_exporter[205307]: ERROR   14:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:47:31 compute-0 openstack_network_exporter[205307]: ERROR   14:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:47:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:47:31 compute-0 openstack_network_exporter[205307]: ERROR   14:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:47:31 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:47:31 compute-0 nova_compute[192698]: 2025-10-01 14:47:31.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:33 compute-0 nova_compute[192698]: 2025-10-01 14:47:33.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:36 compute-0 nova_compute[192698]: 2025-10-01 14:47:36.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:38 compute-0 podman[235905]: 2025-10-01 14:47:38.184361756 +0000 UTC m=+0.095837568 container health_status e86beb7951b8578787d8127e7fc3c669c9cd78e38d3b8bbf0eb6434f7958543c (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, container_name=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct 01 14:47:38 compute-0 nova_compute[192698]: 2025-10-01 14:47:38.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:41 compute-0 nova_compute[192698]: 2025-10-01 14:47:41.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:42 compute-0 nova_compute[192698]: 2025-10-01 14:47:42.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:47:43 compute-0 nova_compute[192698]: 2025-10-01 14:47:43.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:43 compute-0 nova_compute[192698]: 2025-10-01 14:47:43.777 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:47:43 compute-0 nova_compute[192698]: 2025-10-01 14:47:43.777 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:47:43 compute-0 nova_compute[192698]: 2025-10-01 14:47:43.778 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:47:43 compute-0 nova_compute[192698]: 2025-10-01 14:47:43.778 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Oct 01 14:47:44 compute-0 nova_compute[192698]: 2025-10-01 14:47:44.007 2 WARNING nova.virt.libvirt.driver [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct 01 14:47:44 compute-0 nova_compute[192698]: 2025-10-01 14:47:44.009 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Oct 01 14:47:44 compute-0 nova_compute[192698]: 2025-10-01 14:47:44.036 2 DEBUG oslo_concurrency.processutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Oct 01 14:47:44 compute-0 nova_compute[192698]: 2025-10-01 14:47:44.036 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5822MB free_disk=73.29331970214844GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Oct 01 14:47:44 compute-0 nova_compute[192698]: 2025-10-01 14:47:44.037 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Oct 01 14:47:44 compute-0 nova_compute[192698]: 2025-10-01 14:47:44.037 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Oct 01 14:47:45 compute-0 podman[235928]: 2025-10-01 14:47:45.169040172 +0000 UTC m=+0.072005600 container health_status d29a5b2df3232de3fc8ffa11129eb74e1227c7ddc762fdb66a657a6a79e69cd0 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Oct 01 14:47:45 compute-0 podman[235927]: 2025-10-01 14:47:45.185798941 +0000 UTC m=+0.097858142 container health_status 393829e049dcc45d4a2e8a027b21ebd0d33435b43b6913b225fa2c66873a8da9 (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Oct 01 14:47:45 compute-0 nova_compute[192698]: 2025-10-01 14:47:45.482 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Oct 01 14:47:45 compute-0 nova_compute[192698]: 2025-10-01 14:47:45.483 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 14:47:44 up  1:47,  0 user,  load average: 0.02, 0.17, 0.22\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Oct 01 14:47:45 compute-0 nova_compute[192698]: 2025-10-01 14:47:45.526 2 DEBUG nova.compute.provider_tree [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed in ProviderTree for provider: ee1e54f5-453b-4949-a499-9a192f03b8f0 update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Oct 01 14:47:46 compute-0 nova_compute[192698]: 2025-10-01 14:47:46.045 2 DEBUG nova.scheduler.client.report [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Inventory has not changed for provider ee1e54f5-453b-4949-a499-9a192f03b8f0 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Oct 01 14:47:46 compute-0 nova_compute[192698]: 2025-10-01 14:47:46.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:46 compute-0 nova_compute[192698]: 2025-10-01 14:47:46.638 2 DEBUG nova.compute.resource_tracker [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Oct 01 14:47:46 compute-0 nova_compute[192698]: 2025-10-01 14:47:46.639 2 DEBUG oslo_concurrency.lockutils [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.602s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Oct 01 14:47:48 compute-0 nova_compute[192698]: 2025-10-01 14:47:48.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:49 compute-0 podman[235969]: 2025-10-01 14:47:49.179795425 +0000 UTC m=+0.091845301 container health_status a05d4d56c1d0e505d12e82b0400b06fd268296d83265263761e035227da0b74e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct 01 14:47:49 compute-0 nova_compute[192698]: 2025-10-01 14:47:49.639 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:47:49 compute-0 nova_compute[192698]: 2025-10-01 14:47:49.639 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:47:49 compute-0 nova_compute[192698]: 2025-10-01 14:47:49.640 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:47:49 compute-0 nova_compute[192698]: 2025-10-01 14:47:49.640 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:47:49 compute-0 nova_compute[192698]: 2025-10-01 14:47:49.915 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:47:51 compute-0 nova_compute[192698]: 2025-10-01 14:47:51.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:51 compute-0 nova_compute[192698]: 2025-10-01 14:47:51.913 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:47:53 compute-0 sshd-session[235993]: Accepted publickey for zuul from 192.168.122.10 port 53272 ssh2: ECDSA SHA256:G/wBH4NemtaB5A4Xrsc6R+GZmi6HC8VbviS/FKhdd8M
Oct 01 14:47:53 compute-0 systemd-logind[791]: New session 38 of user zuul.
Oct 01 14:47:53 compute-0 systemd[1]: Started Session 38 of User zuul.
Oct 01 14:47:53 compute-0 sshd-session[235993]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Oct 01 14:47:53 compute-0 nova_compute[192698]: 2025-10-01 14:47:53.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:53 compute-0 sudo[235997]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Oct 01 14:47:53 compute-0 sudo[235997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Oct 01 14:47:53 compute-0 nova_compute[192698]: 2025-10-01 14:47:53.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:47:56 compute-0 nova_compute[192698]: 2025-10-01 14:47:56.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:58 compute-0 nova_compute[192698]: 2025-10-01 14:47:58.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:47:58 compute-0 ovs-vsctl[236171]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct 01 14:47:58 compute-0 nova_compute[192698]: 2025-10-01 14:47:58.925 2 DEBUG oslo_service.periodic_task [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Oct 01 14:47:58 compute-0 nova_compute[192698]: 2025-10-01 14:47:58.925 2 DEBUG nova.compute.manager [None req-3be5e3a0-dd9f-4273-b767-78aee17f656e - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Oct 01 14:47:59 compute-0 virtqemud[192597]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct 01 14:47:59 compute-0 virtqemud[192597]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct 01 14:47:59 compute-0 virtqemud[192597]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct 01 14:47:59 compute-0 podman[203144]: time="2025-10-01T14:47:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct 01 14:47:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:47:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19527 "" "Go-http-client/1.1"
Oct 01 14:47:59 compute-0 podman[203144]: @ - - [01/Oct/2025:14:47:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3029 "" "Go-http-client/1.1"
Oct 01 14:48:00 compute-0 podman[236476]: 2025-10-01 14:48:00.317190547 +0000 UTC m=+0.088181323 container health_status 3acc2c427d38ed8582691cf9c9dbcd471bde1cd84b7930615abf74b558d560a3 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Oct 01 14:48:00 compute-0 podman[236518]: 2025-10-01 14:48:00.417679548 +0000 UTC m=+0.098555260 container health_status ded1304d49d8a5b5de237605e051218f4b191ffb97e5fa502978506a74f3995f (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Oct 01 14:48:00 compute-0 crontab[236625]: (root) LIST (root)
Oct 01 14:48:01 compute-0 openstack_network_exporter[205307]: ERROR   14:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:48:01 compute-0 openstack_network_exporter[205307]: ERROR   14:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Oct 01 14:48:01 compute-0 openstack_network_exporter[205307]: ERROR   14:48:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Oct 01 14:48:01 compute-0 openstack_network_exporter[205307]: ERROR   14:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Oct 01 14:48:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:48:01 compute-0 openstack_network_exporter[205307]: ERROR   14:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Oct 01 14:48:01 compute-0 openstack_network_exporter[205307]: 
Oct 01 14:48:01 compute-0 nova_compute[192698]: 2025-10-01 14:48:01.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:48:03 compute-0 systemd[1]: Starting Hostname Service...
Oct 01 14:48:03 compute-0 systemd[1]: Started Hostname Service.
Oct 01 14:48:03 compute-0 nova_compute[192698]: 2025-10-01 14:48:03.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Oct 01 14:48:06 compute-0 nova_compute[192698]: 2025-10-01 14:48:06.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
